In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.
(This is the final post in a three-part series. You can see Part One here and Part Two here.)
The question of the week is:
How do you think artificial intelligence-powered tools like ChatGPT are going to affect K-12 schools, and what are practical strategies teachers can use to respond to them?
Brett Vogelsinger, Gina Parnaby, and TJ Wilson kicked off the series.
In Part Two, Susan Barber, Andrew Cohen, Elizabeth Matheny, and Amanda Kremnitzer contributed their ideas.
Today, I share my own reflections, as do Samantha Parker, Kelli McGraw, and Nick Kelly.
Over the past month, countless articles and columns have been written about ChatGPT and its potential effects on the classroom (you can see a collection of them at The Best Posts on Education & ChatGPT and the previous posts in this series).
I’m not sure I have any particularly novel insights, but here is my present thinking on the topic (which is subject to change anytime over the next hours, days, weeks, and months):
ChatGPT and other artificial intelligence advancements are going to require many of us to substantially adjust elements of how we teach. Though its present version often makes mistakes that are fairly easy to catch (like citing sources that don’t exist), it’s a safe bet that near-future versions will be much more sophisticated. And, even though some tools may be able to detect text that is primarily written by AI, these detectors are likely to always be one step behind the latest and greatest text-generating AI applications. Do we really want to add stress to our teachers’ lives by ineffectively policing student writing?
In addition, it’s extremely likely, if not certain, that AI writing in many facets of personal and professional life will become ubiquitous. There’s an old community organizer’s saying about “living in the world as it is, not as we would like it to be.” Shouldn’t we put energy into preparing our students for the writing of the future and not the past, just as math teachers made adjustments after the invention of the pocket calculator?
There are many specific ideas for how we can do just that in the different articles I linked to at the beginning of this column. One strategy that makes particular sense to me is having students give a prompt to an AI platform and then explicitly annotate how they feel they can improve upon it.
However, that doesn’t mean we have to “throw the baby out with the bath water.” For students to be able to adequately critique and improve upon AI output, they will have to develop their own writing skills. The siren call of AI is going to be very tempting to many of our students, so to minimize its seductive attributes, we can use strategies like returning to handwritten essays in class; assigning writing that requires students to reflect deeply on their own personal experiences or on specific activities and lessons done in class; and/or using project- or problem-based learning, which often relies on uniquely local situations. Of course, providing opportunities for students to write about topics of high personal interest to them also might increase the odds that they won’t want AI to do the heavy lifting for them.
For those of us who teach English-language learners, AI tools create some specific challenges. Right now, it’s often an uphill battle to get students not to overrely on Google Translate when writing. If I had a dollar for every time I’ve said, “Use Google Translate for words, not for sentences,” I would, indeed, be able to retire early. Most of the time, their actions are not out of laziness but out of fear of being wrong (no matter how welcoming and supportive their classroom and teacher might be). Though I think the Google Docs AI-powered Smart Compose feature of suggesting appropriate words benefits English acquisition, I am not looking forward to trying to convince ELLs that AI is not a tool they should rely on.
One benefit to teachers using these AI technologies, though, will be their ability to generate student models—both good and bad ones. For example, ChatGPT quickly provided me with simple and accessible models for historical biographies and compare and contrast essays that I was able to share with my ELL U.S. History class. Links in the “Best” list I shared earlier in this response offered lots of other ideas on how teachers can generate decent materials to use in class that may just require minor tweaking.
I do agree with the decision by some districts, including where I teach, to block ChatGPT—for now. I just don’t think many teachers, including me, are adequately prepared to deal with this challenge. Such a block does raise equity questions, since students whose only laptop access comes through school-issued devices won’t be able to use ChatGPT, while those with their own personal devices can. However, since it is the middle of the school year, many of us are already familiar with the “voice” and style our students use, and so it may be easier for us to detect AI-generated text in any assignments.
By next fall, however, I think we all need to be prepared to make those adjustments and face ChatGPT head-on. It’s a new paradigm, and we had all better get used to it.
Samantha Parker has her M.Ed in instructional design and technology from Arizona State University and her B.A. in secondary English education from the University of Wisconsin-Green Bay. Currently, she is a high school English teacher at ASU Prep Digital, part of ASU Prep Academy:
I remember when I was a little girl and my grandfather took me to his workplace. He was an engineer who built and maintained paper mills. He gave us a tour of a building filled with machines that automated the paper-making industry; however, the highlight was when he asked if we wanted any popcorn. He took us to the break room and made popcorn—in a box, within minutes, no stove, no oil, no visible kernels. I was amazed. It was like magic. And now, just about everyone has a microwave in their home.
I also remember all the fun we had with Ask Jeeves when the internet went mainstream. These days, we can ask Siri or Alexa to play our music, make grocery lists, and even turn the lights in our house on and off. And now, we have ChatGPT by OpenAI, which can write essays, create outlines, write lesson plans, and even write IEP goal rationales. Our K-12 schools are not immune to the world of technology crashing into their tried and true practices.
Lately, opinions on ChatGPT by OpenAI have inundated teacher social media platforms, embraced by some and shunned by others. Education discussions have revolved around mastery skills, assessment, and 21st-century skills. Yet much of the curriculum, and the standards that drive it, is based on 20th-century tools and ideas. ChatGPT is the perfect place for teachers, curriculum designers, and administrators to start in their quest for reform.
ChatGPT is a tool for us to start measuring the effectiveness of our assignments, especially the traditional class essay. Regardless of whether it is informative, persuasive, argumentative, or analytical, the essay has fallen under the formulaic spell of standardization. If a robot can proficiently write an essay or complete a written assignment, it’s time to change the assignment. And going back in time to in-class pen and paper writing tasks is not the answer.
Standards and assessments should align with the overall goal of why we teach writing and what students need to focus on to meet those goals. Schools spend years, nearly a decade, teaching each student to write essays. Yet rarely is this a form of communication used beyond high school. It is time to realign writing standards to the writing skills that students will need post-high school. ChatGPT or other AI writing programs can actually help schools with this task. As writing and organizing tasks are automated, the same AI tools can help us identify the communication and writing skills our curricula need to build upon. Again, if a computer can complete the task, what are the skills humans need to focus on?
I’ve fed ChatGPT many writing prompts since it became available. Overall, the results were less than stellar. The writing output from the program was basic at best, often redundant, and lacking any sort of author’s voice. Will it improve in time? Probably. This is, however, an opportunity to use ChatGPT with students to assess the intricacies of human writing, delineate which tasks can be handled by an AI program, and demonstrate the importance and practice of academic integrity.
We can show students not only how to use AI to help with organization and executive-functioning tasks but also how to use it to improve their writing. Sharing an AI example with students and having them compare it with an exemplary student example could elicit a deep dive into the differences. This could be a great way to teach voice and fluency, two writing skills that can be difficult to teach. Demonstrating how AI does or does not source reference material segues into a discussion on plagiarism and copyright.
One complaint I have heard recently about AI writing programs is the bias that is often evident. Have students create examples using AI to explore and identify passages containing possible bias. Then discuss the bias in our own writing and methods to reduce it.
No one should be surprised by ChatGPT as AI and machine learning have been hurtling toward K-12 education for a while now. Automation, AI, and other technologies are able to assist us with mundane and repetitive tasks. Preparing students to use and understand the benefits, drawbacks, and limitations of these technologies needs to be part of every high school curriculum.
Kelli McGraw and Nick Kelly are senior lecturers at Queensland University of Technology:
Since we wrote at the end of November about how AI could be used in the contexts of teacher education and lesson planning, our feeds have been full of mentions of ChatGPT and how it can be used in education. For K-12 teachers getting excited over the holiday break about possible applications of new and existing AI tools, the reality of student reactions and access limitations could be sobering.
Experimentation will be irresistible
Going beyond the social media chatter about how AI will shape the techno-landscape of the future, many educators globally and in all sectors will be unable to resist being creative with new tools in the present. Not all educators will be eager or ready to do this. But all K-12 teachers will from now on be faced with designing learning experiences for students who potentially have access to these tools at home.
The popularization of AI tools is going to keep causing a ruckus. It’s going to be like when we all got Encarta on CD for our new family computer. Or when study guides appeared on the internet and were still free and most teachers didn’t know about them yet. Or when Wikipedia turned up. Or when laptops (and later, BYOD schemes) were rolled out to schools. Or when YouTube opened the gate for educational video to be shared on social media. These are threshold moments for ed-tech.
The sudden, widespread interest in the use of AI tools for education means that some of our students, and/or their parents, will already be using or poised to use these tools. But which families? Which students? Which tools? And for what purposes?
Giving students a say
Knowledgeable and seasoned educators tacitly know they will have to meet their students where they are when it comes to adding AI tools into the learning mix. Student-centered approaches, including the practice of giving students a say in what they would like to learn, will be vital for success.
Educators can use AI tools in student-centered ways by placing student experience at the heart of their discussions about and experimentation with new tools. This includes:
1. Registration and Access
You must be 18 years or older and able to form a binding contract with OpenAI to use the Services. If you use the Services on behalf of another person or entity, you must have the authority to accept the Terms on their behalf…
It’s possible that, after all, students won’t be allowed to access these tools in school. For now. But just like Encarta, like study guides, like Wikipedia, like YouTube, and like mobile devices, AI tools will work their way into our teaching contexts. Continuing to teach media–information literacy and showing students that you also have the requisite literacies and ethical practices to use popular AI tools will go a long way.
The rise of AI calls us to value process alongside product
As many quickly pointed out, assessments that can be easily passed using AI assistance were probably already due for an overhaul; indeed, this has been true for some time due to the prevalence of contract cheating. Still, significant changes to existing assessment paradigms have yet to be seen.
One solution to such threats to academic integrity is to increase the use of supervised exams as an assessment method. Another solution is to include the collection of process evidence in assessment tasks.
Creating assessments where the process is as valued and highly scrutinized as the product is not a new idea. Arts educators use tools such as the “process diary” to document and validate the creative process, and design educators regularly place more value on the design process than the design product. Inquiry–learning processes similarly have methods for recording, analyzing and sharing findings as they arise.
The existential crisis pervading the education field in response to ChatGPT goes to show how far we have yet to come in embracing process-based approaches in education, broadly speaking. But the strategies inherent in such approaches offer a way—for teachers and students—to use ChatGPT and other AI tools transparently and ethically, as just more digital tools for learning and teaching, instead of framing them as illicit cheat sheets.
Thanks to Samantha, Kelli, and Nick for contributing their thoughts!
Consider contributing a question to be answered in a future post. You can send one to me at [email protected]. When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer to remain anonymous and have a pseudonym in mind.
You can also contact me on Twitter at @Larryferlazzo.
Education Week has published a collection of posts from this blog, along with new material, in an e-book form. It’s titled Classroom Management Q&As: Expert Strategies for Teaching.
Just a reminder: you can subscribe and receive updates from this blog via email. (The RSS feed for this blog, and for all EdWeek articles, was changed by the site redesign; new feeds are not yet available.) And if you missed any of the highlights from the first 11 years of this blog, you can see a categorized list below.
I am also creating a Twitter list including all contributors to this column.
The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.