

Chatbot guides students to learn and reflect

[Image: Student interacting with a chatbot on his screen]

Like all of us, teachers are bound by time and space — but can educational technology offer new ways to make a teacher’s presence and knowledge available to learners? Stanford d.school’s Leticia Britos Cavagnaro is pioneering efforts to extend interactive resources beyond the classroom. She recently developed the “d.bot,” which takes a software feature that many of us know through our experiences as customers — the chatbot — and deploys it instead as a tool for teaching and learning. Britos Cavagnaro, PhD, is co-founder and co-director of the University Innovation Fellows, a program of the Hasso Plattner Institute of Design, known as the d.school, which empowers students to be co-designers of their education, in collaboration with faculty and leaders at their schools.

Jenny Robinson, a member of the Stanford Digital Education team, discussed with Britos Cavagnaro what led to her innovation, how it’s working and what she sees as its future. What follows is an edited version of their conversation.

What inspired you to explore the potential pedagogical usefulness of bots?

[Photo: Leticia Britos Cavagnaro]

Most learning happens in the 99.9% of our lives when we are not in a classroom. The COVID-19 pandemic pushed educators and students out of their classrooms en masse. From one day to the next, instructors had to figure out how to teach in a distributed and chimeric space, in which their home office — or kitchen, or living room — was connected to the many home spaces (or coffee shops) where the students could find access to Wi-Fi. It was a great opportunity to be creative and figure out how to activate in-context learning, taking advantage of the unique spaces where the students were, and the wide world out there.  

Well before this pandemic “exodus,” at the Stanford d.school we had experimented with different approaches and technologies to facilitate self-paced, asynchronous, in-context learning. See for instance this audio-guided immersive exercise in observation and synthesis. I call this “out-in-the-world learning.” 

Do chatbots have special qualities that are suited for out-in-the-world learning?

Chatbots have affordances that can take out-in-the-world learning to the next level. The most important of those affordances is that chatbots can respond differently to each learner, depending on what they say or ask, so the experience adapts to the learner. This can increase the learner’s sense of agency and their ownership of the learning process. 

In addition, the responses of the learner not only determine the chatbot’s responses, but provide data for the teacher to get to know the learner better. This allows the teacher to tweak the chatbot’s design to improve the experience. Equally if not more importantly, it can reveal gaps in knowledge or flawed assumptions the learners hold, which can inform the design of new learning experiences — chatbot-mediated or not. 

So if a chatbot works well, what’s the role of the teacher?

When using a chatbot, the gathering of data and feedback from the students happens in a way that is organic and integrated into the learning experience — without the need for separate surveys or tests. The data is captured digitally in a format that can be analyzed manually or by using algorithms that can detect themes, patterns, and connections. And all of this happens at scale. In effect the teacher can “interact” with and learn from multiple learners at the same time (in theory an infinite number of them). 
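For a concrete picture of what that analysis can look like, here is a minimal, hypothetical Python sketch of mining exported chatbot transcripts for recurring themes. The file name, record fields, and theme keywords are illustrative assumptions, not details from this conversation.

```python
# A hypothetical sketch: count recurring themes across captured chatbot
# transcripts. File name, record fields, and keywords are assumptions.
import json
from collections import Counter

THEMES = {
    "confusion": ["confused", "don't understand", "unclear"],
    "confidence": ["makes sense", "got it", "that's clear"],
}

theme_counts = Counter()
with open("transcripts.jsonl") as f:          # one JSON record per student reply
    for line in f:
        reply = json.loads(line)["student_text"].lower()
        for theme, keywords in THEMES.items():
            if any(kw in reply for kw in keywords):
                theme_counts[theme] += 1

print(theme_counts.most_common())             # themes ordered by frequency
```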

I do not see chatbots as a replacement for the teacher, but as one more tool in their toolbox, or a new medium that can be used to design learning experiences in a way that extends the capacity and unique abilities of the teacher. In the same way that we may use Google Slides or PowerPoint to design a sequence of content to share with the students, or scaffold an experiential activity with instructions on the slides, we can design a chatbot-mediated conversation that guides the student in an exploration or in a reflection. 

Your bot, the d.bot, is a certain type of bot: a scripted bot. Describe what it does and where/how it’s being used.  

A scripted chatbot, also called a rule-based chatbot, engages in conversations by following a decision tree that the chatbot designer has mapped out, using if/then logic. In contrast, NLP chatbots (NLP stands for Natural Language Processing), which use Artificial Intelligence, make sense of whatever the person writes and respond accordingly. Based on my initial explorations of the current capabilities and limitations of both types of chatbots, I opted for scripted chatbots. 

What does rule-based chat look like, in action?

In the images below you can see two sections of the flowchart of one of my chatbots. In the first one you can see that the chatbot is asking the person how they are feeling, and responding differently according to their answer. As you can see, the answers are predetermined and encoded in the flowchart. When I started exploring different kinds of chatbots, I noticed that, even when a scripted chatbot was “putting words in my mouth” (or more accurately in my fingertips), it could still feel like a natural and empathetic conversation if it was well designed (I also experienced chatbots that were obnoxious and frustrating, by virtue of their poorly designed conversation flow). 

[Image: Flowchart showing a chatbot greeting a viewer]
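For readers who want to see the if/then logic behind such a flowchart, here is a minimal Python sketch of a scripted greeting branch. The node names and wording are illustrative; this is not the actual d.bot script.

```python
# A minimal sketch of a scripted chatbot's greeting branch: each node says
# a line, offers predetermined answers, and branches on the choice.
SCRIPT = {
    "greeting": {
        "say": "Hi! How are you feeling today?",
        "options": {"Great!": "great_path", "A bit stressed": "stressed_path"},
    },
    "great_path": {"say": "Wonderful! Let's dive right in.", "options": {}},
    "stressed_path": {
        "say": "Thanks for sharing. We'll take this one step at a time.",
        "options": {},
    },
}

def run(script, node_id="greeting"):
    """Walk the decision tree: print each node's line, branch on the choice."""
    while True:
        node = script[node_id]
        print("BOT:", node["say"])
        if not node["options"]:          # leaf node: this segment ends
            break
        labels = list(node["options"])   # predetermined answers, as in the flowchart
        for i, label in enumerate(labels, 1):
            print(f"  {i}. {label}")
        node_id = node["options"][labels[int(input("> ")) - 1]]

run(SCRIPT)
```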


In this other segment of the chatbot flow, the two answers that are offered as options allow the person to ask for clarification if they don’t understand a term that’s being used (I anticipated this might be the case, as it was a term that had been recently introduced in class). You can see that if they choose the “Synthesizing?” option, the chatbot takes a detour and explains what the word means, then continues on the same path that it would have taken if the answer chosen had not indicated a lack of understanding. Here's a demo of the d.bot.

[Image: Flowchart of a chatbot explaining a concept]
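The detour-and-rejoin pattern can be expressed in the same hypothetical SCRIPT structure from the sketch above: the clarification node explains the term and then points back to the same node the direct path leads to.

```python
# The same structure can encode the detour: "Synthesizing?" leads to an
# explanation node that rejoins the main path. Wording is illustrative.
SCRIPT.update({
    "next_step": {  # (a full script would link earlier nodes to this one)
        "say": "Ready to start synthesizing your observations?",
        "options": {"Yes, let's go": "synthesis", "Synthesizing?": "explain_term"},
    },
    "explain_term": {
        # Detour: define the term, then continue exactly where the
        # "Yes, let's go" branch would have gone.
        "say": "Synthesizing means combining your observations into insights.",
        "options": {"Got it!": "synthesis"},
    },
    "synthesis": {
        "say": "Great - start by picking your three strongest observations.",
        "options": {},
    },
})

run(SCRIPT, node_id="next_step")   # walk just this segment of the flow
```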


You talk about the d.bot being “proudly artificial.” Why does it matter that the d.bot is identified as a bot, not a person? 

I borrowed the term “proudly artificial” from Lauren Kunze, the CEO of the chatbot platform Pandorabots. Simply put, the chatbot makes clear that it is not human. It would be unethical to use a chatbot to interact with students under false pretenses. It is very important that they understand from the beginning that they are not chatting with a human. At the same time, they should also be told which teacher designed the chatbot and, most importantly, that the information they share with it will be seen by that teacher. Depending on the activity and the goals, I often design the bot to ask students for a code name instead of their real name (the chatbot refers to the person by that name at different points in the conversation). I’m also very clear, through what the bot says to the user and what I say when I first introduce the bot, about how the information that is shared will be used. Oftentimes reflections that students share with the bot are shared with the class, without identifiable information, as a starting point for social learning.

An interesting note related to this: California’s state legislature is the first to pass a law — B.O.T. (“Bolstering Online Transparency”) Act, SB1001 — that regulates the use of bots for online communications and prohibits them from pretending to be human (to reduce deceptive commercial practices and those which may unduly influence civic processes).

So honesty about the botness of the bot is good.

Yes. Making sure that your bot clearly communicates its artificial nature has important implications for the learning outcomes: research has shown that people are more comfortable sharing information and asking questions of an agent that they perceive as nonhuman. Angel Hsing-Chi Hwang and Andrea Stevenson Won, from Cornell University, found that people generated more and better ideas in a virtual brainstorming task when they perceived that they were interacting with a nonhuman agent, as compared to when they were interacting with a human collaborator. This makes sense in light of what is known about team dynamics and how people often refrain from contributing ideas out of fear of being judged by their teammates.

The d.bot has a distinct voice and personality; it’s playful, positive, and knowledgeable. It uses GIFs. Is there a sense in which a bot is like an avatar of the instructor? How does the projection of personality affect the ability of students to respond and collaborate?

I should clarify that d.bot — named after its home base, the d.school — is just one member of my bottery (‘bottery’ is a neologism to refer to a group of bots, like a pack of wolves, or a flock of birds). Over the past year I’ve designed several chatbots that serve different purposes and also have different voices and personalities. 

When designing a pedagogical chatbot (or any chatbot, for that matter), there are decisions that need to be made about its visible characteristics — like its name, its avatar, the kind of language and expressions it uses, whether it uses GIFs, recorded voice, etc. All of those details communicate different things to the users — explicitly and implicitly — and may affect how they relate to and respond to the bot. For instance, the name that you assign to the bot might create associations with a specific gender (and unleash implicit biases associated with that gender); the avatar could be anthropomorphic and that may predispose people to relate to it in a different way than if it looked robotic or abstract. So it’s important to be intentional in those choices. 

SPACE10 (IKEA’s research and design lab) published a fascinating survey asking people what characteristics they would like to see in a virtual AI assistant. Beyond gender and form of the bot, the survey revealed many open questions in the growing field of human-robot interaction (HRI). 

I was intrigued by the way that the bot is different from, say, a list of instructions. A bot can guide a student, or a group of students, through a complex process in a way that is immediate in time, that provides pertinent information to students at the moment it’s needed. Are there some types of processes for which the use of a bot could be especially productive or valuable?

In general, when an activity is best done in a self-paced way and outside of the classroom, a chatbot is now my default medium to create that learning experience. Here are a few of the use cases I’ve explored:

  • A reflection coach, which I call “Rebot”: Rebot shares with learners a prompt related to a specific topic — it could be an account of something that happened in class, or a segment of a podcast or other resource that relates to the topic at hand — and guides learners through a structured reflection on it. This bot-guided reflection usually happens outside of the classroom, at a time, place, and pace that works best for the learner. While the reflections are usually for the learners’ eyes only, the Rebot may ask the learner to answer an open-ended question with an insight that they are comfortable sharing — anonymously or not — with the class, to promote social learning.
  • A guide for team formation, which I call “Teambot”: When I create student teams for projects, I use Teambot to lead them in an activity in which team members answer a series of progressive questions with the aim of developing psychological safety. It helps them to collectively reflect on how they will work as a team. In the past I’d facilitate this in the classroom synchronously, which is not ideal because different teams move through the questions at different paces. Teambot deploys each question just in time, as the team is ready to answer it. 
  • A tour guide, which I call “Spacebot”: A few months ago I was asked to give a tour of the d.school to a group visiting from a university. As I happened to be out of town, I designed a chatbot to guide the visitors through the space, pointing out different elements of its design, asking questions about their impressions and thoughts, and sharing more information according to their interests. That group liked the chatbot-guided tour so much that I’ve continued to use it for other groups. I also designed another version of it for our workshop for faculty educators. We traditionally have them explore the d.school building and reflect on how they can use physical space as a lever to create transformative learning experiences. Spacebot is now helping our team with that. 

What’s the next step you’re working on? 

There’s a lot of fascinating research in the area of human-robot collaboration and human-robot teams. One article I read recently that is particularly relevant to higher education showed how a conversational agent (via chat) improved the outcomes of student teams completing a CS task by making interventions that helped students make connections between their contributions and those of their teammates, or contrast their responses using appropriate academic discourse.

I believe the most powerful learning moments happen beyond the walls of the classroom and outside of the time boxes of our course schedules. Authentic learning happens when a person is trying to do or figure out something that they care about — much more so than the problem sets or design challenges that we give them as part of their coursework. It’s in those moments that learners could benefit from a timely piece of advice or feedback, or a suggested “move” or method to try. So I’m currently working on what I call a “cobot” — a hybrid between a rule-based chatbot and an NLP chatbot — that can collaborate with humans when they need it and as they pursue their own goals. You can picture it as a sidekick in your pocket, one that has been trained at the d.school, has “learned” a large number of design methods, and is always available to offer its knowledge to you. 
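One way to picture such a hybrid, building on the hypothetical SCRIPT sketches above: follow the scripted branches while they match, and fall back to an open-ended NLP reply when the learner goes off-script. The generate_reply function below is a placeholder stub, not a real model or API.

```python
# A speculative sketch of the hybrid "cobot": rule-based while the script has
# a matching branch, NLP fallback otherwise. `generate_reply` is hypothetical.
def generate_reply(user_text: str) -> str:
    """Placeholder for an NLP model that has 'learned' design methods."""
    return "One method to try here: reframe your problem as a 'How might we...?' question."

def cobot_turn(script, node_id, user_text):
    """Return (next_node_id, reply). Take the scripted branch if the reply
    matches one of the node's predetermined options; otherwise give an
    open-ended NLP reply and stay on the current scripted node."""
    node = script[node_id]
    if user_text in node["options"]:                  # rule-based path
        next_id = node["options"][user_text]
        return next_id, script[next_id]["say"]
    return node_id, generate_reply(user_text)         # NLP fallback
```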

What would you recommend to teachers who are interested in exploring how they can use a bot to support instruction and learning?

Here are a few ideas to get started:

  1. Check out a simple rebot (reflection bot) that is a derivative of one I designed for our faculty workshop. It may give you a sense of what a scripted bot designed for learning looks like, and it may spark some ideas for how you might use one. 
  2. Also, explore NLP bots, as their capabilities continue to improve (and at a rapid pace). Try kuki.ai, or get an account at OpenAI and play with the example chatbots available in their playground (for instance, compare the “friend chatbot” with “Marv the sarcastic chatbot”).
  3. Dive in and try your hand at designing a chatbot of your own. Landbot, the platform I use, has a free trial period, and also special pricing for NGOs/educational orgs. It’s also easy to learn.
  4. Check out this short selection of resources that I came across as I started my own exploration: 


Published October 5, 2022

