Three Laws of Curriculum Design in the AI Era (opinion)

Almost one-third of students report that they don’t know how or when to use generative AI to help with their coursework. On our campus, students tell us they worry that if they don’t learn how to use AI, they will be left behind in the workforce. At the same time, many worry that the technology will disrupt their learning.
Here is Gabby, an undergraduate on our campus: “It makes my writing something I wouldn’t say. It makes it hard for me to think my own thoughts; everything I think disappears, and it gets replaced with something formal. And it sounds right. It’s hard for me to disagree with it once it’s been said.”
Students are also anxious about being accused of unauthorized AI use, even when they don’t use the tools. Here is another student: “If I write like myself, I’ll lose points for not following the prompt. If I fix my grammar and follow the template, my teacher will look at me and assume I’m using ChatGPT, because brown people don’t write that well.”
Faculty guidance in the classroom is crucial to addressing these concerns, especially as campuses increasingly provide students with access to enterprise AI tools. Our own campus system, California State University, recently launched an AI strategy that includes a “landmark” partnership with companies such as OpenAI, along with free ChatGPT Edu access for all students, faculty and staff.
Perhaps unsurprisingly, students aren’t the only ones confused and worried about AI in this rapidly evolving environment. Faculty are also uncertain about whether, and under what circumstances, students can and should use AI technologies. In our roles at the Center for Equity and Excellence in Teaching and Learning (CEETL) at San Francisco State University, we are often asked about campus-wide policies and about tools like Turnitin for ensuring academic integrity.
As Kyle Jensen noted at a recent AAC&U event on AI and pedagogy, those working in higher education are experiencing a lack of coherent leadership around AI while facing many competing demands on faculty and administrator time. Paradoxically, faculty are at once deeply interested in the positive potential of AI technologies and insistent on some kind of accountability system to punish students for unauthorized AI use.
Faculty need clarity about the role of AI in the curriculum. To address this need at CEETL, we developed what we call the “Three Laws of Curriculum in the Age of AI,” a nod to Isaac Asimov’s “Three Laws of Robotics,” which were meant to ensure that humans remain in control of technology. Our three laws are not laws per se; they are a framework for thinking about how to address AI technologies at every level of the curriculum, from individual courses to degree road maps, from general education to graduate programs. The framework is designed to support faculty as they grapple with the challenges and promise of AI technologies, and it reduces faculty cognitive load by connecting AI to the familiar work of designing and revising courses.
The first law concerns teaching students about AI, including how the tools work and their impacts on society, culture, the environment and the workforce; their potential biases; their tendency to hallucinate and produce misinformation; and their tendency to center Western European ways of knowing, reasoning and writing. Here we draw on critical AI studies to help students apply their critical information literacy skills to AI technologies. Consider how teaching students about AI aligns with our colleges’ core equity values and takes advantage of faculty’s natural skepticism of these tools. This first law (teaching students about AI) bridges AI enthusiasts and skeptics by rooting approaches to AI in familiar, widely shared values of equity and critical inquiry.
The second law of our framework asks what students need to know in order to work ethically and equitably with AI. How should students use these tools as they become increasingly embedded in the platforms and programs students already use, and as they are incorporated into the jobs and careers our students hope to enter? As Kathleen Landy recently asked, “What do we want students in our academic program[s] to know and be able to do with (or without) generative AI?”
This “with” part of our framework supports faculty as they begin revising learning outcomes, assignments and assessment materials to account for AI use.
Finally, and perhaps most critically (this relates to the “without” in Landy’s question), what skills and practices do students need to develop without AI, in order to protect their learning and to center their own culturally diverse ways of knowing? This quote from the University of Washington’s teaching center is apt:
Sometimes students must first learn foundational skills in a field in order to succeed in the long term, even if they may later use shortcuts once they are working with more advanced material.
Chatbots sound authoritative, and because their output sounds so good, students find it persuasive, which can lead to situations where the bot’s thinking overrides or replaces students’ own; their use may therefore short-circuit the ways students develop and practice the thinking named in many of our learning goals. Protecting student learning from AI helps faculty approach academic integrity in terms of curriculum design rather than surveilling or policing student behavior. It invites faculty to think about how to redesign assignments so that students have space for their own thinking.
Given the ubiquity of AI tools available to students, providing and protecting such spaces undoubtedly poses new challenges for faculty. But we also know that protecting students from easy shortcuts has always been at the heart of formal schooling. Consider the planning that goes into deciding whether an assessment should be open or closed book, open or closed note, take-home or in-class. Such decisions are rooted in the third law: What best protects students from the shortcuts (e.g., textbooks, outside help) that would undermine their learning?
University teaching center websites are amassing resources and guidance for navigating these new technologies. Faculty can find it all overwhelming, to say the least, especially given heavy teaching loads and the limits of faculty time. Our three-law framework provides scaffolding for faculty and staff as they sift through resources on AI and begin redesigning assignments, activities and assessments to address it. You can see our three laws in action here, in a live annotation of Jennifer redesigning her first-year writing course to address the challenges and potential of AI technologies.
In the spirit of connecting the new to the familiar, we remind readers that while AI technologies pose new challenges, these challenges are in some ways no different from the curriculum and assessment design work we regularly undertake when building courses. Indeed, faculty have long been grappling with the questions our current moment raises. We leave you with this quote from an article by Gail E. Hawisher and Cynthia L. Selfe on the rise of word-processing technology in writing studies:
“We do not advocate abandoning the use of technology and returning to composing primarily with script and print, without word processing and other computer applications (such as communication software); nor do we suggest discarding descriptions of the positive learning environments that technology can help us create. Rather, we must try to use our awareness of the discrepancies we have noted to temper the images of technology we construct, maintaining the critical perspective necessary for a balanced and increasingly important view of computers in writing instruction.”