AI Symposium: Mining the Mind 

Annie Murphy Paul, the keynote speaker at the third Artificial Intelligence (AI) Symposium hosted by Loomis Chaffee, expanded the mind by talking about her book, The Extended Mind. 

More than 200 teachers and administrators — including about 80 from outside schools — attended the conference on Tuesday, June 3. More than 30 schools were represented at the symposium, which included 25 workshop options for participants. 

Ms. Paul’s book posits that our minds aren’t confined to our brains but extend into our relationships with people, the world around us, and technology. That concept resonated with Loomis Chaffee English teacher Courtney Jackson. 

“I appreciated that she emphasized the importance of our humanity, and the ways in which we already extend our thinking outside of our minds, and AI is just another iteration of that,” Courtney said. “I've been pretty resistant to [AI] as a humanities person, but I think it helped me enhance or evolve the way I thought about it as a tool. And some of the things she mentioned are little nuggets that I can use to frame AI with my students, so I found that helpful as well.” 

Little nuggets were no doubt on offer in all the workshops. Participants were given notebooks to jot down ideas as they moved through sessions. 

“AI challenges a lot of what all schools do, so there is a community for this,” said Matt Johnson, who helped organize the symposium and is the assistant director for academic technology in the school’s Kravis Center for Excellence in Teaching. “People are trying to think through this in creative, thoughtful ways that align with their values, so this symposium is trying to think about [AI] in a way that is authentic to us, in our classrooms ... and also clear ways of not using AI.” 

AI keeps evolving. Maya Bialik, one of the presenters in an afternoon session, co-authored a 2019 book on AI, and a follow-up is in the works; after all, much has changed in the AI world in the last six years. 

“How To Define and Draw Clear Lines for Appropriate AI Use” was the title of the session led by Ms. Bialik and Peter Nilsson. Ms. Bialik founded QuestionWell, an app that helps teachers use AI in their lesson preparation. Mr. Nilsson curates education-related news, produces the Educator’s Notebook, and founded Athena Lab, where teachers can collaborate and share practices. Their session explored ways teachers can help students determine when they should, and should not, use AI for learning. Whether AI is appropriate depends in part on the learning objective of the particular lesson, the presenters said. 

Kate Seyboth, the director of digital and computational learning at Loomis, led a session called, “Better Together: How Human Connection Transforms AI into an Effective Educational Partner.” As the description of her session noted, “AI tools have enormous potential in education, but many educators hesitate to fully embrace them due to concerns that they may hinder rather than enhance deep learning.” 

At the beginning of the session, Kate asked participants what they saw as the upsides and downsides of AI. The proffered downsides: creating lazy thinkers, not checking the accuracy of AI-generated information, lacking the intuition to sense that something isn’t right, prioritizing answers over understanding, and delivering an overload of information. The suggested upsides: generating practice problems that are helpful learning tools, summarizing concepts, assisting with brainstorming and feedback, and serving as a useful tutor. 

One teacher said she liked the idea-generation part but was less keen on the feedback part; she felt feedback should be person to person. 

Jen Solomon, the associate director of innovation at Loomis Chaffee, led a workshop about design thinking and brainstorming tools that tap into the creativity of individuals and groups.  “In a world where AI can generate ideas at the click of a button, it's more important than ever to empower students to think creatively and independently,” the workshop description posited. 

Jen projected a quote from cognitive scientist Daniel T. Willingham: “Memory is the residue of thought.”  

“If my goal is for [students] to learn something, I want them [the students, not AI] to have thought about it,” Jen said. “But I'm also all for them outsourcing things that aren't necessarily associated with my learning goals. I'm trying to figure out where do we find the balance — sometimes I use my GPS and sometimes I drive with my eyeballs and my brain. I need to be able to do both. So when am I using ChatGPT or Copilot to come up with 10 ideas on something and I'll choose one versus when am I coming up with the ideas myself?” 

Jen said she read an article that made her think, “If my husband wrote my birthday card using ChatGPT, would I be OK with that? Or if it expresses his real feelings, does it matter? So what is my personal litmus test?” 

These are some of the many questions educators are grappling with in today’s AI world. 

We asked Copilot, “Are teachers afraid of AI?” The response, in part, was, “Ultimately, most educators are figuring out how to adapt — some with hesitation, some with excitement.” 

We really didn’t need AI to tell us that, though. After all, we are human. 

