Artificial intelligence: What happens now for education

A panel of Stanford experts offers guidance on how educators should approach new generative AI tools such as ChatGPT.
Stanford graduate student Parth Sarin, Professor of Education Sarah Levine and Josh Weiss, Graduate School of Education director of digital learning solutions, at a March 2, 2023 panel on AI in education. Photo by VPGE/Alex Ramirez Gillaspy.

The natural language AI tool ChatGPT, trained on vast amounts of data and then refined by human feedback, has helped to ideate and launch businesses, write code, plan parties, and author everything from a vacation itinerary to a play. It even passed an MBA exam at Wharton. ChatGPT and other generative AI tools seem certain to disrupt education, but will this technology really transform education as we know it? What does the launch of ChatGPT and technologies like it mean for educators at Stanford right now?

A March 2 panel discussion, Artificial Intelligence in Education, explored these and other questions. Organizers from the Stanford Academic Technology Community of Practice brought together faculty, staff, and students to consider what new generative artificial intelligence technology could mean at Stanford, and for the future of education. 

The panelists brought deep expertise in both education and AI technologies. Sarah Levine, assistant professor at the Stanford Graduate School of Education, researches ways that AI can assist in teaching reading and writing to middle and high school students.

Parth Sarin is an MS student in computer science and instructor for the course "The AI Toolbox: An Everyday Guide," which will be offered later this year in Title I high schools across the country through a partnership between Stanford Digital Education and the National Education Equity Lab. Josh Weiss, director of digital learning solutions at the Office of Innovation and Technology at Stanford Graduate School of Education, works to develop and iterate digital learning experiences. Increasingly, these projects involve generative AI and its capacity for co-learning and co-making. 

Panelists summarized how ChatGPT and other generative AI technologies already offer opportunities in teaching and learning. Through summarizing and synthesizing content on a topic, ChatGPT could fundamentally change learning, helping people to understand new topics more quickly. It is particularly helpful with reading and understanding content that is jargon-filled or complex. Similarly, AI tools can assist in parsing dense research data or digging through academic writing to extract meaning.

Panelists offered additional examples of ways AI can improve learning: brainstorming thesis statements to help students get started on a writing assignment, serving as an informed sparring partner in debate, and comparing opposing interpretations of a text.

For educators, as in other fields of work, AI could offer a productivity boost. Weiss noted AI’s capabilities to “supercharge productivity and creativity” in both the workplace and academia. AI can also speed up some parts of educators’ work — for instance, developing discussion prompts and building quizzes — to allow more time for educators to provide deeper feedback or other personalized attention for learners. 

At the moment, however, this supercharge is only available to those who have access to the tool and are educated in its effective use. “Artificial intelligence can offer an equity lift — if we design well,” remarked Weiss, noting the already-widening gap between the “tech haves and have-nots” in our society.

To narrow that gap, education on how best to leverage the power of AI technologies is needed. Already, learners need sharp information literacy skills to locate and identify real, authoritative sources of information across the vast oceans of what is available. Now, new AI tools risk amplifying misinformation if the content they generate isn’t verified. “We need to prepare students for spotting incorrect information,” said Levine.

To help students understand the limitations of the technology, Levine suggested offering chances for students to discover a major limitation of ChatGPT: it will provide fully fabricated responses to questions it knows nothing about. “Have students start personal,” Levine recommended, “and allow them to see how willing ChatGPT is to make stuff up.” By doing so, students will be more aware of the importance of vetting and editing AI-generated content to avoid presenting wholly incorrect information as valid.

AI can help overcome some bias in education, but it also has the potential to introduce bias. AI can scale feedback and help level it across students; this is particularly helpful, for instance, when multiple graders work across a course, as it can help counteract bias among graders. It could also be used to reduce bias in other areas of higher education, such as college admissions. However, the data AI is trained on contains inherent bias of its own, so AI output must be carefully reviewed for signs that bias is filtering into the generated content.

“It’s drawing on ideas from the existing social system,” said Sarin, explaining that it inherits the biases and inequities from that system. The solution can be technical, at least in part: for example, look for good data to train the AI. However, the most effective way to manage AI’s bias is through education. “Bring it to the classroom, and surface the bias,” Sarin suggested, providing information about the AI’s shortcomings and inviting students to imagine solutions. 

Despite the excitement about the possibilities of AI in education, panelists underscored the importance of the human aspect of learners’ success. To them, there is no replacing the human instructor. “Students don’t love and trust a robot like they love and trust their teacher,” commented Sarin. Artificial intelligence is a capable and powerful tool for teaching, learning, creativity, and productivity, but it is not yet good at judgment. Weiss said that while it’s a bottomless well of ideas, the tool must be guided and managed to perform. 

ChatGPT, introduced in November 2022, and other generative AI tools are still very new. As one audience member remarked, this group may need to reconvene for another panel discussion in six months. Indeed, GPT-4 was released on March 14, about two weeks after the panel, and improves dramatically on the earlier model. According to a report from OpenAI, GPT-4 performs extremely well on tough exams such as the GRE and LSAT.

So what can educators at Stanford do? Groups across Stanford have already developed practical overviews of the topic and toolkits for managing AI in the instructional environment. More events like the March 2 panel and the recent AI+Education Summit will bring diverse groups together to address the subject collaboratively. Additional practical workshops are being developed by the organizers of this panel on topics like how to design course activities that leverage AI.

The Stanford Academic Technology Community of Practice organized the hybrid event in partnership with the Stanford Graduate School of Education, Stanford Accelerator for Learning, the Center for Teaching and Learning, and Stanford Digital Education. It was the second event in a series. The organizers hope to continue coordinating dialogue to advance understanding of emerging technologies among scholars and practitioners alike.

To learn about future events, email digitaleducation@stanford.edu.

Published March 22, 2023

Cindy Berhtram is associate director of project management.

