
AI in Higher Education

Student Experiences with AI Learning Tools

What works, what doesn't, and what students actually want

  • Participants: N = 21
  • Methods: surveys + focus groups
  • AI tools tested: 8 different tools
01 · The Context

AI tools are flooding into higher education, but we know surprisingly little about how students actually experience them.

92% of UK undergraduates now use AI tools—up from 66% just one year prior. Yet universities struggle to provide clear guidance, and educators often choose AI tools with limited insight into what students find genuinely helpful.

We designed a study to find out: across different AI formats and subject areas, what do students actually value?

  • 92% of students using AI in 2025
  • 5% of educators pay for premium AI
02 · The Study

21 university students completed three one-week modules—cognitive psychology, nutrition science, and computer programming—each featuring a suite of AI-powered learning tools.

The AI tools we tested

Interactive:
  • QuizBot: retrieval practice with AI-generated questions
  • Coggie: Socratic dialogue tutor
  • Interactive Activities: hands-on exercises and visual demos
  • Client Simulation: voice-based counselling practice
  • ExampleBot: real-world application generator
  • Reading Assistant: on-demand concept clarification

Passive:
  • AI Lectures: avatar-delivered video content
  • AI Podcasts: conversational audio summaries

Surveys

Ratings on effectiveness, engagement, motivation, satisfaction, and future use after each module

Focus Groups

Semi-structured discussions exploring experiences, preferences, and broader attitudes toward AI

03 · The Clear Winner

Interactive tools that required active participation consistently outperformed passive formats.

"With the tools where you would have to respond... I was actually doing something... like active learning. But if it was something like the lecture or the podcast, I would find myself zoning out."
— Study participant

"The AI lecture was robotic and monotonous... there's an uncanny valley effect. In person lectures can be funny, boring, sometimes even start singing—there's unexpected elements."
— Study participant
04 · What Students Told Us

Focus group discussions revealed nine themes about how students experience AI in education:

1. Passive formats lack connection
AI lectures triggered "uncanny valley" effects. Missing spontaneity and human warmth made them hard to engage with.

2. Interactive tools foster reflection
Tools requiring responses changed behaviour: students felt accountable and brought "a different level of awareness."

3. Immediate, judgement-free feedback
24/7 availability and non-judgemental responses reduced anxiety. "With AI, there's no stupid question."

4. AI as supplement, not replacement
Students positioned AI as an "assistant" while valuing human educators for depth, stories, and social connection.

5. Conditional trust
Trust depended on perceived accuracy, institutional vetting, and professional presentation. Many cross-referenced outputs.

6. The accessibility-laziness paradox
AI removes barriers but creates temptation for shortcuts. Students recognised needing conscious effort to avoid over-reliance.

7. Academic integrity concerns
Students worried about cheating, unclear sourcing, and the inability to distinguish AI-generated work.

8. Inconsistent institutional support
Mixed messages, stigma, and lack of training left students teaching themselves. "I learned all of this by myself."

9. Career preparation gaps
Students see AI as essential for future careers but feel their education doesn't prepare them for it.

05 · The Core Tension

Students

  • Eager to use AI tools
  • See career relevance
  • Developing critical approaches
  • Want guidance and training

vs.

Institutions

  • Often resistant or hesitant
  • Inconsistent policies
  • Limited faculty training
  • Creating stigma around use

"University is like 100% against AI... They instil a fear of AI, unfortunately. If I'm sitting in my research lab and I have to use ChatGPT... I wouldn't want my professor to see that, because it's just so stigmatised."
— Study participant
Key Takeaway

Students are already adapting to AI—experimenting, evaluating, and developing strategic approaches. They prefer interactive, feedback-rich tools over passive formats, and they want AI as a supplement to human teaching, not a replacement. The question is whether institutions will provide guidance or leave students to navigate this transition alone.

06 · Implications for Practice

Design for interaction

The most effective tools engaged students through retrieval practice, Socratic dialogue, and hands-on manipulation—not passive content delivery.

Provide immediate feedback

Tools that helped students identify knowledge gaps in real time were consistently valued over those that simply delivered content.

Develop clear institutional guidance

The gap between student readiness and institutional support creates confusion and pushes AI use underground.

Integrate AI literacy into curriculum

Students anticipate working alongside AI but feel unprepared. This needn't require new courses—AI skills can be woven into existing subjects.

Working Paper

Access the full study

Corbett, B. J., & Tangen, J. M. (2025). Student experiences with AI-powered learning tools in higher education. Manuscript in preparation.