Teaching students how to learn (with and without AI)


Learning How to Learn (with AI, Actually)

I wrote the first draft of the “Using AI as a Tutor” chapter in the forthcoming Norton Guide to AI-Aware Teaching, co-authored with Annette Vee and Marc Watkins. I pitched this chapter for the book because I was brought into the author team as the “STEM guy,” that is, a co-author who could bring some STEM education perspectives to the work, and because the number one use case of generative AI in STEM education that I hear about is students turning to AI for help understanding course content.

Students using AI chatbots like ChatGPT or Gemini as some kind of synthetic tutor happens outside of STEM fields, of course, including in writing instruction, where my co-authors do most of their teaching. I’ve been drawn to Anna Mills’ notion of an ethical tutor, which she writes about in her book AI and College Writing: An Orientation:

Generally speaking, tutors expect that student work should be student ideas and student words. They don’t complete part of the assignment, but they do give feedback, examples, and explanations. Tutors get training to help them make the call on gray areas: is it okay to suggest a new outline for your essay or should they ask you questions to help you come up with one on your own?

Mills points out that when a student interacts with an ethical human tutor, the tutor can set useful boundaries between helping and overhelping, but when students turn to an AI chatbot as a synthetic tutor, it’s generally up to the students to set those boundaries. That’s largely due to the sycophancy problem that AI chatbots have–they’re generally designed to be super helpful in order to keep users using them.

It’s altogether too easy for a student stuck on a physics problem to turn to ChatGPT to ask for help and receive a full and complete AI-generated solution to the problem. Whether or not that solution is correct (AI’s other big problem in the learning space is accuracy), a student receiving that solution is robbed of the opportunity to do the hard work of learning, the kind of work that an ethical human tutor would support through good questioning.

As I started brainstorming the AI-as-tutor chapter, I knew I wanted to share some examples of AI prompts that would help students turn their favorite AI chatbot into a more helpful tutor, one that would simulate the kinds of questions and suggestions that an ethical human tutor might use. My assumption is that students are going to turn to AI chatbots for this kind of help, so we might as well provide students with prompts to get more out of those interactions.

If you go searching for prompts of this sort online, you’ll find lots of collections offered by higher ed faculty and others. While writing the chapter, I started looking through some of these collections of tutoring prompts and, as is my habit, looking for patterns and sorting the prompts into categories. It quickly became clear that the better prompts for using AI as a tutor all aligned with some element of what cognitive science tells us about how learning works.

For example, here’s a suggested prompt from Cassandra Silva Silibin, who teaches philosophy at the City University of New York:

Ask me to name a topic in [insert course name here] where I need an explanation. Then ask me about some of my hobbies. Then explain the topic using an analogy that draws from my hobbies.

We often make sense of new information by connecting it to something we already know, leveraging our existing mental models of how the world works. Silibin’s prompt asks the AI chatbot to support this process by drawing an analogy between the new topic of study and the student’s existing mental models of whatever hobbies they’re into.

Or consider this prompt shared by Emily Schwitzgebel of Northwestern University:

I am going to be the teacher, and you will be the student. As your teacher, I am going to tell you everything I know about ___. Once I’ve given you this information, I’d like you to summarize what I’ve told you. Once that is done, ask me if I'd like to share any additional information or need to correct any part of my summary.

We know that one way to learn something is to teach it to someone else. In Small Teaching: Everyday Lessons from the Science of Learning, James Lang calls this idea explaining, and it’s a learning strategy we instructors often leverage in group work, class discussions, and student explanations. A student can use the prompt here to have their AI chatbot act as a simulated learner so the student can, hopefully, find their way to understanding through explaining.

In the AI-as-tutor chapter, we share a few more examples. You can probably imagine the kinds of AI prompts that generate opportunities for retrieval practice or help students build their knowledge organizations through simulated conversation.

I was reminded of this connection between more useful AI tutoring prompts and cognitive science earlier this week when I listened to Lucy Appert’s appearance on the Dead Ideas in Teaching podcast from Columbia University. Appert directs a teaching center at New York University and hosts the What Learning Looks Like podcast.

In her Dead Ideas interview, Appert pointed out that AI vendors who target the education market often give students the wrong idea about learning, specifically that learning should be frictionless and efficient and that AI can help students learn in those ways. “Anyone who has been an educator,” Appert said, “knows that there’s no evidence to support these claims.” But students don’t all know this because educators have, as Appert put it, “a transparency problem.” That is, we know how learning actually works, but we don’t teach our students how learning actually works. As a result, our students are likely to fall for the false promises of certain AI vendors.

I have been increasingly convinced that we need to do a better job teaching our students how learning works. When I had Mary-Ann Winkelmes, director of the Transparency in Learning and Teaching (TILT) project, on my podcast in 2023, she argued that even when students come to college with general study skills, they often lack discipline-specific study skills. That argument was bouncing around my head last October when I wrote about teaching students a variety of reading strategies so they would be in a better position to use AI to support and not undermine their reading. And the What Learning Looks Like podcast is all about making the mechanisms of learning more transparent for both instructors and students.

What if we could use our students’ willingness to turn to generative AI to get tutoring help as an opportunity to teach students something about how learning actually works? What if by providing students with AI prompts like the ones above–along with some explanation of the learning principles those prompts leverage–we can better equip our students to be learners, both now and in the future?

Sharing a few well-designed AI prompts with students won’t turn them into expert learners, of course, but I think there is a useful way to frame students’ use of AI around the adoption of better learning strategies. Do you take this approach with your students? If so, I’d love to hear about it.

Moving Student Writing into the Classroom

Last November, University of Virginia English professor James Seitz offered a workshop through the UVA Center for Teaching Excellence titled “Teaching in the Age of AI: How Students Can Do All Their Writing in the Classroom.” When I saw the workshop announcement, I have to admit that my initial reaction wasn’t a positive one. Was this another call to return to the days of blue books, with high-stakes essay exams depending on students being able to practice the lost art of handwriting? I quickly learned that wasn’t the case! Jim’s students are welcome to write on their laptops during class; he just asks them to turn their Wi-Fi off and put their word-processing program in focus mode.

I was excited to have Jim Seitz on the podcast last week to share how he has moved the writing his students do into the classroom. This move is a response to generative AI’s disruption of writing instruction, yes, but it’s also the latest in a series of teaching choices Jim has made to teach his students writing as a way of thinking and to change their relationship with writing. Jim takes a very thoughtful and intentional approach to his in-class writing days, as you’ll hear in our conversation.

What really convinced me of the value of Jim’s approach was the response he’s received from his students. Not only did they appreciate not having assignment deadlines looming over them for days or weeks, but they also greatly appreciated the distraction-free environment Jim creates for them during class. “I remember this one student saying,” Jim told me, “you know, in the course of writing my first paragraph, I would check my phone 20 times. So I can’t get any momentum going.”

You can listen to my conversation with Jim Seitz here, or search for “Intentional Teaching” in your podcast app.


Intentional Teaching with Derek Bruff

Welcome to the Intentional Teaching newsletter! I'm Derek Bruff, educator and author. The name of this newsletter is a reminder that we should be intentional in how we teach, but also in how we develop as teachers over time. I hope this newsletter will be a valuable part of your professional development as an educator.
