By Tom Lowe
A central theme in student engagement strategies in higher education is the importance of amplifying students’ voices in relation to their educational experiences, particularly in quality assurance and enhancement activities. The UK sector, which provides the basis of my experience, has seen near-universal provider adoption of student voice practices (e.g. engaging with course representatives) as a ‘business as usual’ part of annual operations. Student engagement in the development of education likely informs hundreds of individual enhancements daily in our sector, ranging from small- to medium-scale changes to course design through to large-scale institutional and sector-wide reform. It is an activity that tests our thinking and our processes, and it is often a feature of our committee conversations, in which student voice data informs many of our decisions. In these contexts, colleagues are frequently asked: “have you consulted students on this?” If a decision impacts students, we hope the answer is always ‘yes’, but this blog asks whether there is a further question that also needs answering. When does student voice data go ‘out of date’? Does student voice data have a ‘use by’ or ‘expiration’ date, after which it is no longer relevant to the student cohort currently within the institution?
The Student Voice Sector of UK Higher Education
This spring, a team from the University of Westminster, supported by a consortium of universities, will publish a sector audit of student voice and representation practices in UK higher education. The findings from this audit highlight that the sector remains committed to student voice, from democratic course representatives to module-based course surveys, paid student-staff partnership research projects, and students’ union membership of a large proportion of university committees. If one positions students’ feedback as a source of data – whether survey results or the verbal input of a student contributing to a decision – this information can be seen as a fixed set of insights, informing decision making for many years to come.
It is commendable to have seen through this audit the UK sector’s wide adoption of student voice; however, it is also clear that practice varies significantly: in rates of payment per hour, in whether student representatives are democratically elected or selected, in lengths of ‘service’ in particular roles, and so on. Our report, published with the Quality Assurance Agency for Higher Education later this spring, will highlight this sector variance – but in this blog I’d also like to explore a little further the question of student voice data timeliness.
When do we engage with students?
The point at which we engage students in student voice activities varies considerably. A major area of such activity is the engagement of student course representatives. In these initiatives, or in activities such as student focus groups, if the students themselves are in the room (or virtual room), we could argue that this is ‘live’ student voice data, as it comes from students living the experience we are asking about. I’ve covered the question of the ‘representativeness’ of the student voice elsewhere, so I’ll keep my focus on the temporal dimension of student voice for this blog. In these cases, we are often asking students about their experience to inform decisions today (i.e. how we can react, respond, and evolve to and for the current student experience). Going beyond course representatives, research, evaluation, or consultation studies conducted specifically to inform future decisions or changes often find us asking students today to provide the voice of the students of tomorrow. Finally, shifting the chronology of student voice once again, when student leaders, such as students’ union representatives, speak with senior leadership on broader matters of student experience, they often reflect on their prior years’ experience of studying at the institution.
The second most dominant source of student voice is survey data, which may be qualitative (direct quotes or thematised analysis from open questions) or quantitative (most often Likert scales). Survey data often has a lag: although distributing and analysing digital surveys is now easier than ever, there is always some time between collection, analysis, and the dissemination of findings – the key word for our line of questioning being ‘some’. This may be a few weeks between a mid-module evaluation being collected and assessed, or, as in the case of the UK National Student Survey (NSS) of satisfaction, it may leave us looking at data from now-alumni, captured six months before the present-day cohort. Furthermore, we will often find a committee reviewing student voice data from more than a year ago, or compiled across many years to explore trends. I can personally attest to seeing, many times in my career, student voice data from several years ago informing committee decisions for today. With this in mind, consider the large-scale differences in student experience between 2019 and 2024: the student voice captured in those years was bound by its context.
This brings me back to the question of time for student voice: does it have a use-by date? If so, when? And what factors impact this consideration?
Last week’s milk for tomorrow’s breakfast
Considering these potential time delays, we can often remark that we are making decisions for next year’s students based on last year’s data. This is a point of tension for many in the sector, as the many factors that influenced a group of students’ experiences (or even a single course representative’s, for that matter) are incredibly varied, complex, and possibly quite time-bound. Universities are constantly evolving, whether through enhancing services, developing curricula, or something as simple as lecturers rotating which modules they teach. A major influence or event, such as an individual lecturer or a fantastic (or disastrous) learning experience, may also hugely shape the data, and therefore the future decisions based on it.
I recognise that we cannot move towards continuous and overbearing student voice activity in order to always capture the student voice from ‘right here, right now’ – particularly as students are already over-surveyed. However, the questions raised here are worth considering. Perhaps it depends on the topic on which students are providing feedback: feedback on cafeteria menus is likely to stay reasonably static, but opinions on individual modules may ebb and flow with differences between cohorts. I also wonder what students think of this question – do they see a timestamp on student voice data and think “that does not represent students today”, and if so, what factors influence that judgement? I’d be very curious to know.
Critically reflecting on student engagement
Ending as I began this blog, it fills me with great pride to be part of a higher education sector so clearly committed to student voice activities. However, it is important to critically reflect on the representativeness of our student voice data, the method of feedback collection, and the timeliness of the data received. All research is bound by its context, and therefore consideration of when, where, and how the data was gathered is important. Finally, these considerations do not diminish the importance of engaging with the learner voice, but ask us to be careful and transparent in a busy data world, where everything is measured and used to inform important decisions. The human conversation remains the most authentic, and when faced with an uncertain moment in higher education decision-making, I always recommend a discussion with current students as the best place to start.
Tom Lowe has researched and innovated in student engagement across diverse settings for over ten years, in areas such as student voice, retention, employability, and student-staff partnership. Tom works at the University of Westminster as Assistant Head of School (Student Experience) in Finance and Accounting, where he leads on student experience, outcomes, and belonging. Tom is also the Chair of RAISE, a network for all stakeholders in higher education for researching, innovating, and sharing best practice in student engagement. Prior to Westminster, Tom was a Senior Lecturer in Higher Education at the University of Portsmouth, and previously held leadership positions for engagement and employability at the University of Winchester. Tom has published two books on student engagement with Routledge: ‘A Handbook for Student Engagement in Higher Education: Theory into Practice’ (2020) and ‘Advancing Student Engagement in Higher Education: Reflection, Critique and Challenge’ (2023), and has supported over 40 institutions in consultancy and advisory roles internationally.
Readings
Austen, L., 2020. The amplification of student voices via institutional research and evaluation. In A Handbook for Student Engagement in Higher Education (pp. 164-176). Routledge.
Cuthbert, A., 2025. How representative are elected student representatives? A Literature Review. Student Engagement in Higher Education Journal, 7(1), pp.185-200.
Fletcher, A.F., 2017. Student voice revolution: The meaningful student involvement handbook. CommonAction Publishing.