AI in Higher Education
Student Experiences with AI Learning Tools
What works, what doesn't, and what students actually want
The Context
AI tools are flooding into higher education, but we know surprisingly little about how students actually experience them.
92% of UK undergraduates now use AI tools—up from 66% just one year prior. Yet universities struggle to provide clear guidance, and educators often choose AI tools with limited insight into what students find genuinely helpful.
We designed a study to find out: across different AI formats and subject areas, what do students actually value?
The Study
21 university students completed three one-week modules—cognitive psychology, nutrition science, and computer programming—each featuring a suite of AI-powered learning tools.
The AI tools we tested
QuizBot
Retrieval practice with AI-generated questions
Coggie
Socratic dialogue tutor
Interactive Activities
Hands-on exercises and visual demos
Client Simulation
Voice-based counselling practice
ExampleBot
Real-world application generator
Reading Assistant
On-demand concept clarification
AI Lectures
Avatar-delivered video content
AI Podcasts
Conversational audio summaries
Surveys
Ratings of effectiveness, engagement, motivation, satisfaction, and intended future use, collected after each module
Focus Groups
Semi-structured discussions exploring experiences, preferences, and broader attitudes toward AI
The Clear Winner
Interactive tools that required active participation consistently outperformed passive formats.
"With the tools where you would have to respond... I was actually doing something... like active learning. But if it was something like the lecture or the podcast, I would find myself zoning out."
— Study participant
"The AI lecture was robotic and monotonous... there's an uncanny valley effect. In person lectures can be funny, boring, sometimes even start singing—there's unexpected elements."
— Study participant
What Students Told Us
Focus group discussions revealed nine themes about how students experience AI in education:
Passive formats lack connection
AI lectures triggered "uncanny valley" effects. Missing spontaneity and human warmth made them hard to engage with.
Interactive tools foster reflection
Tools requiring responses changed behaviour—students felt accountable and brought "a different level of awareness."
Immediate, judgement-free feedback
24/7 availability and non-judgemental responses reduced anxiety. "With AI, there's no stupid question."
AI as supplement, not replacement
Students positioned AI as an "assistant" while valuing human educators for depth, stories, and social connection.
Conditional trust
Trust depended on perceived accuracy, institutional vetting, and professional presentation. Many cross-referenced outputs.
The accessibility-laziness paradox
AI removes barriers but creates temptation for shortcuts. Students recognised needing conscious effort to avoid over-reliance.
Academic integrity concerns
Students worried about cheating, unclear sourcing, and the inability to distinguish AI-generated work.
Inconsistent institutional support
Mixed messages, stigma, and lack of training left students teaching themselves. "I learned all of this by myself."
Career preparation gaps
Students see AI as essential for future careers but feel their education doesn't prepare them for it.
The Core Tension
Students
- Eager to use AI tools
- See career relevance
- Developing critical approaches
- Want guidance and training
Institutions
- Often resistant or hesitant
- Inconsistent policies
- Limited faculty training
- Creating stigma around use
"University is like 100% against AI... They instil a fear of AI, unfortunately. If I'm sitting in my research lab and I have to use ChatGPT... I wouldn't want my professor to see that, because it's just so stigmatised."
Students are already adapting to AI—experimenting, evaluating, and developing strategic approaches. They prefer interactive, feedback-rich tools over passive formats, and they want AI as a supplement to human teaching, not a replacement. The question is whether institutions will provide guidance or leave students to navigate this transition alone.
Implications for Practice
Design for interaction
The most effective tools engaged students through retrieval practice, Socratic dialogue, and hands-on manipulation—not passive content delivery.
Provide immediate feedback
Tools that helped students identify knowledge gaps in real time were consistently valued over those that simply delivered content.
Develop clear institutional guidance
The gap between student readiness and institutional support creates confusion and pushes AI use underground.
Integrate AI literacy into curriculum
Students anticipate working alongside AI but feel unprepared. This needn't require new courses—AI skills can be woven into existing subjects.
Access the full study
Corbett, B. J., & Tangen, J. M. (2025). Student experiences with AI-powered learning tools in higher education. Manuscript in preparation.