National Student Survey 2014

Well, if the second week of August wasn’t busy enough already, with A-level results and the onset of clearing, along come this year’s National Student Survey results to give HE wonks and award leaders something else to think about.

As HEFCE announce, student satisfaction has risen nationally once again, with 86% of students saying they are satisfied and “satisfaction has either improved since 2013 or stayed the same in each of the seven categories covered by the survey”.

Professor Madeleine Atkins, HEFCE Chief Executive, said:

‘I’m delighted to see record levels of student satisfaction this year, as well as marked improvements in satisfaction with assessment and feedback over the last decade.

‘The NSS is the largest survey of its kind in the UK. Over the last 10 years it has helped over 2 million students to make their voices heard about the things that matter to them, and has been fundamental to driving change in our universities and colleges.

‘In a period of technological advance, internationalisation and funding reforms, the NSS will continue to enable students’ views to be heard and to stimulate innovation and excellence in teaching and learning in our universities and colleges.’

HEFCE also provide links to a recent review of the NSS, which considers how effective the current survey is and makes recommendations for changes, primarily around adding questions on “student engagement”. The report also discusses methodological issues related to the use of the survey, stating:

The NSS results can be used responsibly in the following ways with proper caution:

  • To track the development of responses over time
  • To report absolute scores at local and national levels
  • To compare results with agreed internal benchmarks
  • To compare the responses of different student groups, including equity target groups
  • To make comparisons, with appropriate vigilance and knowledge of statistical variance, between programmes in the same subject area at different institutions
  • To help stimulate change and enhance dialogue about teaching and learning.

However, they cannot be used responsibly in these ways:

  • To compare subject areas, e.g. Art & Design vs. Engineering, within an institution unless adjustments are made for typical subject area differences nationally
  • To compare scores on different aspects of the student experience (between different scales, e.g. assessment vs. teaching) in an unsophisticated way
  • To compare whole institutions without taking account of sources of variation such as subject mix and student characteristics
  • To construct league tables of programmes or institutions that do not allow for the fact that the majority of results are not materially different.

Academics and other commentators have long been critical of the usefulness of the NSS, and on publication HEFCE asked via Twitter whether it was fit for purpose…

The Times Higher ran an article on sector views of the usefulness of the NSS, in which one lecturer at a university in the West of England described it as “about as scientifically useful as TripAdvisor is for travellers”.

Nonetheless, it is the instrument we currently have, and so, as an institution, we will spend the coming year looking at our results and how to use them effectively.

At institutional level we have seen another improvement: just like the sector overall, our results have risen year on year.

While we acknowledge the difficulties of comparing dissimilar subjects, it is relatively easy to benchmark our subject groups against other institutions using the full data-set from HEFCE.
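As a rough illustration of what that benchmarking might look like, here is a minimal sketch in Python/pandas. It assumes the HEFCE full results have been exported to a CSV; the file name, the column names (institution, subject_area, question, pct_agree) and the institution label are placeholders for illustration, not the actual layout of the published data.

```python
import pandas as pd

# Minimal benchmarking sketch. File name, column names and the
# institution label are hypothetical placeholders, not the real
# layout of the HEFCE full results.
OUR_INSTITUTION = "Our University"

nss = pd.read_csv("nss_2014_full_results.csv")  # hypothetical CSV export

# Keep only the overall satisfaction item (Q22 in the 2014 survey).
overall = nss[nss["question"] == "Q22"]

# Sector mean per subject area, so like is compared with like.
sector_mean = (overall.groupby("subject_area")["pct_agree"]
               .mean()
               .rename("sector_mean"))

# Our own score per subject area (assumes one row per subject area).
our_score = (overall[overall["institution"] == OUR_INSTITUTION]
             .set_index("subject_area")["pct_agree"]
             .rename("our_score"))

# Positive differences mean the subject group sits above the sector mean.
benchmark = pd.concat([our_score, sector_mean], axis=1).dropna()
benchmark["diff_from_sector"] = benchmark["our_score"] - benchmark["sector_mean"]

print(benchmark.sort_values("diff_from_sector", ascending=False))
```

The same pattern extends to the other question scales, and to comparisons against agreed internal benchmarks, which the review lists among the responsible uses of the data.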

We will also look at the individual awards that appear as outliers in our results: those that gained 100% overall satisfaction (I can’t name them all, as not all the data can be used publicly) should be a source of ideas for those whose results were outliers at the other end of the scale.