During a recent discussion, a colleague asked a retired educator: “What are the two feelings you most want your students to have toward you in order for you to be effective?”
The educator answered immediately: “Confidence and trust. Not engagement. Not motivation. Not achievement. Confidence and trust.”
That response points to key challenges educators and institutions face with AI in education. Amid ongoing discussions about tools, assessment, integrity, frameworks, ethics, and skills, a quieter but crucial question emerges: where do students place their confidence, and who do they trust when they are uncertain?
AI meets learners at these moments of uncertainty, adapting explanations instantly, noticing patterns in mistakes, and providing patient, judgment-free feedback.
But who learners trust first when they are uncertain can change everything.
Dr. Stephen Murgatroyd puts it this way:
“AI is no longer a peripheral instructional aid but a persistent cognitive companion. What matters is not that students use AI, but that it increasingly mediates uncertainty, sense-making, and confidence at the very start of learning, before institutional actors ever enter the frame. This challenges long-standing assumptions about where learning happens and who holds authority.”
Dr. Stephen Murgatroyd
Globally recognized scholar and strategist in higher education, futures, and digital transformation
This framing invites a different way of thinking about how early learning momentum is established, particularly in relation to the role institutions have traditionally played.
The idea of confidence forming early is unsettling, not because it is wrong, but because of how quietly it happens.
Dr. Divya Singh outlines the risk this creates for higher education:
“In our reality, where students enter higher education having already learned to rely on AI tools and GPT as primary information sources, university academics must consciously focus on learning engagements that induce students to think independently and expressively. To continue otherwise destroys the critical need for opinion plurality and the recognition of difference, thereby undermining the fundamentals of the social construct. For student generations growing up in data-poor regions, there is an additional risk of being recolonized by the global north, whose voice is embedded in most GPT models. Thought homogenization is a future simply too ghastly to contemplate.”
Dr. Divya Singh
Chief Academic Officer, STADIO Holdings
This raises a subtle but important question: whose voices do learners internalize before they encounter the full diversity of academic perspectives?
But trust is not only about whose voices are heard. It is also about who learners actually believe when deciding whether an answer is good enough to move on.
Stephen Downes puts it plainly:
“Nobody cares where your calculator got the information that 2 + 2 = 4, so it does not matter if you ask the calculator first. But people do care where a story of personal experience comes from. Do not ask your calculator what it is like to think mathematically. Ask a mathematician instead. Educators need to engage with this distinction as though it were a matter of life and death for the profession.”
Stephen Downes
Researcher studying the intersection of digital technology, media, education, and philosophy
Learners often begin to form trust in the moments between working through problems and making sense of concepts, before they fully recognize the differences between types of tasks.
This is not just theory; it is a habit beginning to take shape. When learners repeatedly turn to the same place when they are unsure, that place starts to feel reliable. Over time, judgment does not just get supported there; it begins there. Often, this happens before learners turn to their teacher or institution.
Dr. Asha Singh Kanwar points to what this means for institutions:
“While AI can be considered an integral part of our learning ecosystems, it is important for institutions to recognize and deliberately shape its role as a ubiquitous cognitive companion that learners increasingly encounter first. AI can be a powerful tool for promoting learnability, but institutions need to invest in the AI literacy of teachers to empower students at the point where reliance on AI is first forming.”
Dr. Asha Singh Kanwar
Chair Professor, Beijing Normal University
Confidence in learning encompasses both the perception that educators have learners’ best interests at heart and a recognition of learners’ challenges, including their mental and emotional well-being. Research suggests that confidence is not primarily cognitive; it emerges from the relationships between teachers and learners. How confidence develops in learning contexts increasingly mediated by AI remains an area for further study, and confidence formed with AI may not transfer to educators without accompanying support, which highlights the importance of intentional instructional strategies.
AI has entered the space where confidence and trust often begin.
The question for education may no longer be whether learners use AI. Perhaps it is whether institutions and educators are paying attention, watching where trust starts to form, and deciding what must still remain unmistakably human.
Confidence and trust appear to be central learning outcomes that education must safeguard, even as AI increasingly mediates early learner experiences.


