Now that AI has arrived on the scene everywhere in schools, subtle nuances in its presumed effects are beginning to reveal themselves.
A study focusing on children aged three to five examined their trust in information provided by humans and robots. It found that younger children tend to trust humans more than robots, even if the human has been inaccurate in the past.
Wow. I’m glad I’m on Team Human, although it doesn’t mean teaching older kids is any easier.
Older children, as you guessed, parents of teens out there, show equal distrust toward unreliable information whether it comes from humans or robots. This research sheds light on the development of trust and social learning in the digital age, highlighting the need to consider children's developmental stages when designing educational robots and AI tools.
It also reveals that robots won't necessarily be seen as the "be-all and end-all" trustworthy solution for learning success.
From a practical standpoint, this study is instructive for middle and high school leaders who oversee faculty charged with inspiring critical thinking, social-emotional learning, and problem-based learning.
The best way to address this inherent mistrust head-on is to make room for everyone to have a voice in a conversation and to encourage critical thinking at every turn.
Classroom conversations encourage students to question, analyze, and synthesize information, fostering a deeper understanding and acceptance of what they learn.
Conversations – not chatbots – are the shortest path to creating the sense of trust and reliability in information that learning requires.
Listen to the Teach Different Podcast and experience a conversation that models a technique that makes unforgettable conversations routine.
Picture by DALL-E.