Another day, another league table. This time it’s the Times Higher Education Student Experience Survey.
Rather than the large sample used in the National Student Survey, this survey is based on a panel of students recruited through UCAS, who were questioned in 2012-13. For this university the sample size was 116.
The article alongside the data states:
All respondents were members of YouthSight’s student panel – who are recruited via Ucas – and their views were gathered between October 2012 and June 2013.
The Times Higher Education Student Experience Survey is broken down into 21 attributes of universities, chosen by students as key indicators. Participants were asked to rate how their university performed in each of the areas using a seven-point scale. Each attribute was assigned a weight reflecting its importance within the overall student experience.
The same wording and weighting methodology have been used for the past five years, with the greatest weight applied to the attributes that correlated most strongly with whether or not the respondent would recommend the university to a friend.
Only universities achieving 50 or more ratings have been included in the final dataset, and each university’s score was indexed on a scale from one to 100. A total of 111 institutions (102 last year) met the minimum sample threshold, drawn from 14,300 respondents in all.
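Out of curiosity, here is a minimal sketch of how a weighted index of this kind might be computed. The attribute names, weights and ratings below are invented for illustration – THE does not publish the full weighting scheme, so none of these numbers are the real ones.

```python
# Hypothetical sketch: weighted mean of 7-point attribute ratings,
# rescaled to a 1-100 index. All names and numbers are illustrative.

weights = {
    "good teaching": 1.5,        # heavier weight: correlates with recommending
    "good social life": 1.0,
    "good accommodation": 0.8,
}

ratings = {                      # mean 7-point ratings for one university
    "good teaching": 5.8,
    "good social life": 6.1,
    "good accommodation": 4.9,
}

def indexed_score(ratings, weights):
    """Weighted mean on the 1-7 scale, mapped linearly onto 1-100."""
    total = sum(weights.values())
    weighted_mean = sum(weights[a] * ratings[a] for a in weights) / total
    return 1 + (weighted_mean - 1) * 99 / 6

print(round(indexed_score(ratings, weights), 1))  # e.g. 78.1
```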
The difference in scores of similarly ranked institutions will not be statistically significant. When results are based on a sample of around 100, we have to accept some imprecision arising from sampling variability. But that is not to say that these results are without meaning. In this context, the relatively high level of consistency in the data from year to year is reassuring. For example, in each of the past four years, the universities of Sheffield, East Anglia, Dundee, Oxford, Cambridge and Leeds have all featured in the top 10 – this consistency suggests the impact of best practice rather than sampling variability.
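To get a feel for that imprecision, here is a back-of-the-envelope margin of error for a mean rating based on roughly 100 respondents, assuming (purely for illustration) a standard deviation of about 1.2 on the seven-point scale:

```python
import math

n = 100     # approximate sample size per institution
sd = 1.2    # assumed spread of ratings on the 7-point scale (illustrative)
z = 1.96    # multiplier for a 95% confidence interval

margin = z * sd / math.sqrt(n)
print(f"95% margin of error: +/- {margin:.2f} points on the 7-point scale")
# ~ +/- 0.24 points - easily enough to reorder closely ranked institutions
```

With gaps between neighbouring institutions often smaller than that margin, small movements up or down the table tell us very little on their own.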
So for the universities that showed significant rises or consistently high rankings – what do they suggest is the reason?
Sheffield – academic skills classes and the chance to learn a foreign language, culture of listening to students
Bath – good industry connections, sports facilities, involving students in decisions (even the design of some of the new accommodation buildings), and a dedicated student experience forum made up of students, senior academics and heads of services
Falmouth – investment in teaching facilities, the development of a mentor scheme for incoming students and the introduction of more counselling and living support staff
Stirling – reduced class sizes, improved student feedback and employability embedded into its degrees
The article notes that post-92 universities, and in particular those aligned to million+, tend to have a more diverse student body with more mature students, who are likely to be less satisfied.
Common themes from the article about how to succeed in student experience seem to revolve around involving students in decision-making, genuinely responding to concerns and providing a wide forum for debate, as well as embedding employability and improving feedback.
And as for the score for our university – well, a disappointing drop (and difficult to understand when our NSS figures improved in the same year). Mind you, in the previous year we had a significant climb of 14 places, which does call into question how reliable such a small sample can be.
Comparing our scores against the means, our biggest outliers are: Good social life; Good extra-curricular activities / societies; Good community atmosphere; Good accommodation.
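Spotting those outliers is just a matter of ranking the gaps between our attribute scores and the sector means. A quick sketch, with placeholder numbers rather than the published figures:

```python
# Placeholder scores - not the published figures.
our_scores = {
    "Good social life": 5.2,
    "Good extra-curricular activities / societies": 5.0,
    "Good community atmosphere": 4.8,
    "Good accommodation": 4.6,
}
sector_means = {
    "Good social life": 5.9,
    "Good extra-curricular activities / societies": 5.6,
    "Good community atmosphere": 5.5,
    "Good accommodation": 5.3,
}

# Largest negative gap first: the attributes we most underperform on
gaps = {a: our_scores[a] - sector_means[a] for a in our_scores}
for attr, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{attr}: {gap:+.1f}")
```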