Learning Lessons from School

Often at this time of year, I write a “review of the year” blog piece, summarising my writing over the year, and what HE and my own university in particular have gone through. That all changes this year – my university and I parted company in March (I think I’m allowed to say that) – so there is little to review from that perspective.

However, it was still a big year in HE – a new Higher Education and Research Act, the first gold, silver and bronze TEF awards, and now a consultation on the role of the new OfS.

In all of these, the importance of the student experience, and particularly the use of metrics to demonstrate how well a university performs, is paramount – ostensibly to allow prospective students to make choices, but most likely to allow league table compilers, journalists and others to make (specious?) comparisons.

So, if I can’t write much about my experiences of HE this year, what can I share? Since September I’ve spent most of my time both as a postgraduate student, and learning to teach in secondary school classrooms, and there are three lessons I can take from there that, if I were a subject lead in HE, I’d be considering.

I fully accept that university lecturers and school teachers do different jobs. But if you’re in a teaching intensive post-92 university, chances are that some of what you need to do is not that different from a teacher in a local academy or school.

Lesson One – Assessment and Feedback

Pretty much every university and course gains poor scores in the National Student Survey for this. Learning and Teaching committees agonise over it, develop complicated feedback procedures and guidelines, set minimum times to return marks (15 to 20 days), but wonder why students are still unhappy about feedback.

Maybe it’s because they compare it with their school experience. Assessment happens constantly: formative assessment (perhaps peer- or self-assessed) in nearly every lesson, and personalised verbal and written feedback every couple of weeks. If students take a test, marks and feedback are returned within the week. This is the expectation of students – universities could think about how to develop a transition to enable them to adjust to university approaches, but equally support them in those early, vulnerable weeks.

Lesson Two – Learning from Teaching Observation

Peer (or heaven forfend, management) observation of teaching in universities is, at its best, a collaborative experience, sharing good practice and relying on professional approaches to self-development. However, for many this is based on being observed for perhaps one hour a year, and observing others for maybe two or three hours.

New lecturers may be observed a little more in their first year as part of any post graduate certificate they are taking.

Contrast this with the school experience. As a trainee teacher I am currently observed teaching for 10 hours a week – every session that I deliver. And at the end of each session, there is a four-page feedback form where I am scored against 8 standards, with 3 levels of competence. Advice is provided on what worked well, and what I can do to improve. Targets are set for the next week’s sessions, and I have to provide my own reflection. Guess what – my teaching improves each week, and I develop new techniques and new ideas from the advice I get. I also go and observe a range of other people whenever I want to.

If we want to make big changes in teaching practice, and expose people to a greater number of great teachers and different ideas, then the annual observation round leaves much to be desired, as does the way in which new lecturers are supported.

Lesson Three – Know your Students

It’s well known now that a sense of belonging aids student retention and attainment, as evidenced in projects such as the Paul Hamlyn/HEA What Works project, as well as in universities’ focus on developing course identity.

I talked to a university undergraduate course leader the other week, with maybe 30-40 students in each year. They said that they did not know the names of all their final-year students, let alone the others. In school, I’m expected to know the names – and use them – of all my pupils. So in a class of over 30 eleven-year-olds, I can talk to each of them individually, and know a little about them and their abilities. And importantly, I should be able to do this within a couple of weeks of meeting them.

If this is what pupils are used to in school, it’s maybe not surprising that they don’t feel a sense of belonging when they first come to university.

Conclusion

So what’s to be done?

Firstly, I don’t think universities need to replicate schools: the two sectors have different functions, cultures and behaviours. However, a really hard look at transition to university, again particularly in teaching focussed universities whose students may lack some of the cultural capital to thrive instantly, could provide a way of maximising student engagement and attainment.

Seriously, why not send your teaching staff to spend two weeks in a secondary school, shadowing a teacher and seeing what the school experience is like these days – and then reflect on whether you could develop your first semester, in terms of formative assessment, teaching practice and sense of belonging, to really help your students?

Does UK HE have a retention problem?

Last night I attended an event at King’s College London, hosted by the UPP Foundation and Wonkhe, looking at retention issues in UK higher education. The format was a series of initial thoughts from each of five panel members, followed by a lively discussion, showing the importance of this topic.

Richard Brabner

Richard introduced this as the second of three workshops on the student journey. He pointed out that HESA stats on non-continuation show that this is getting worse, especially for students from disadvantaged backgrounds. He reminded the audience that in light of this, Les Ebdon of OFFA expects the next access agreements to focus on retention.

Liz Thomas

Liz started by explaining that UK figures for retention are in fact much better than those of most European countries. In countries with free tuition, there was a feeling that getting students out of the system was part of the quality system. In a world dominated by fees and student loans, that attitude cannot prevail: we admit students to our courses and so we have an obligation to help them succeed. So we do have an issue around student success and retention, in particular around differential levels of success, retention and employment outcomes when we consider BME, WP and other factors.

From the HEA/Paul Hamlyn What Works project it was clear that learning and teaching is critical to student success and retention, by building a sense of belonging in the academic sphere. This goes beyond the curriculum: it is about the whole institution recognising that it needs to make students successful, and needs to consider the role and contribution of all staff.

Sorana Vieru

Sorana of the NUS believes that UK HE does have a retention problem for some groups of students, and suggested that an unforeseen consequence of TEF is that game-playing to satisfy the metrics could exacerbate the situation. The NUS view was that the rushed nature of TEF potentially leaves dangerous holes. Since the key metric that universities can impact is non-continuation, all eyes should be on retention.

Universities should invest more in those supporting activities that are evidence-based; Sorana cited the What Works project as an example of this. If evidence is presented in accessible ways, then the NUS will champion it.

In particular, the impact on commuting students was raised – these are students with financial pressures, family and work commitments, who may have chosen to study at a local university which may not be the right university for them.

Alex Proudfoot

Alex showed that some of the issues for alternative providers are quite different. Their students are much more likely to be from a BME background, or to be aged over 30, so these providers are dealing with very different cohorts of students.

A focus for alternative providers was on delivering courses that focus on employability by creating industry links and ultimately an industry community within the college where staff and students might collaborate on projects outside of class.

In terms of pathways and transitions into HE, students who go through the same provider from level 2 and 3 have better retention at HE levels.

For students with low entry qualifications, classes on study skills are a compulsory part of the curriculum, rather than being an additional optional choice for the student.

Ross Renton

Ross highlighted the huge differences in retention and success based on ethnicity. He emphasised the need to develop an understanding of who is joining your university or course, and to develop a relationship with them before they arrive or join the course.

At Hertfordshire they had previously targeted POLAR quintile 1 and 2 students on entry, and provided peer mentoring plus other additional activity, tailored to each student. Retention figures improved by 43% for these students, and DLHE shows a better rate of graduate employment. This intensive personalisation works, but is expensive.

Ross also highlighted the fact that problems need to be owned by everyone – it’s not a matter of sending a student off to some student hub, but all academic staff need to take ownership. There is also a need to systemise personal tutoring, so that key and meaningful conversations take place at the right times for all students, including at all transition periods, long holidays etc.

In the future Ross saw some risk in being overly focused on the use of metrics and analytics – this is still about people working with people.

Panel Q&A

Key points in the Q&A session were around:

  • How do we support hourly paid lecturers – not delivering HE on the cheap, but supporting the right staff properly?
  • The current retention metrics don’t allow for students to step out of HE with interim qualifications in a flexible framework.
  • Staff also need to feel that they belong, so institutional culture needs to be considered.
  • How do you support students through a whole-institution approach?
  • How can we build success in learning and teaching, including retention and success, into reward and recognition for staff?
  • How do we make the campus more “sticky” for students living at home? The research on commuting students suggests that these students feel the campus is not for them, and that they feel marginalised and invisible. Details in the prospectus will cover accommodation but not local travel. Universities are often not set up to support these students, expecting them to be in 4-5 days a week.
  • The tax burden for those who drop out but have student debt – ethics, and who should pay? One year of study should be seen as a success.
  • Can we use analytics to create better informed interventions? Otherwise it is difficult to personalise in a mass system without good real-time information.

Takeaways

Certain key factors stand out:

  • The need to look carefully at differential retention and success, and to ensure that TEF does not drive perverse behaviours
  • The opportunities to use better analytics to personalise student support
  • The need for rigorous and meaningful personal tutor systems
  • A pressing need to understand how a sticky campus can support commuting students and meet their specific needs.


EdTech futures in the Connected University

Digital technology is bringing huge changes to all industries and sectors, not least higher education. It isn’t the future, it’s the present. This article summarises three recent publications: firstly, the annual NMC Horizon report that I’ve previously blogged on here; secondly, a talk by Steve Wheeler, the keynote speaker at last year’s Learning and Teaching Conference; and finally, a piece by Eric Stoller, who will be delivering a keynote at this year’s conference.

Firstly let’s look at this year’s NMC Horizon report. This is categorised into:

  • Key Trends Accelerating Higher Education Technology Adoption
  • Significant Challenges Impeding Higher Education Technology Adoption
  • Important Developments in Technology for Higher Education

Usefully NMC have provided a summary of their predictions from previous years, and it’s worth noting that not all of their predictions come to pass; equally some remain on the radar for a number of years. Audrey Watters has previously provided a critique of NMC for those who’d like a different view.

Nonetheless, this is a useful starting point, and we can map our own activities against all of the 18 trends/challenges/developments, but here I’ll focus on a few.

As we walk around this campus (and many others in the UK), we can see how learning spaces are being transformed to allow different ways of learning to take place.

We have a major focus on improving staff and student digital capabilities, recognising that this will help drive innovation, as well as improve employability prospects of our graduates.

The achievement gap is one I have blogged about previously – this continues to be a difficult, multi-faceted problem. Technology will not provide all the answers, but may help level the playing field in some areas.

The possibility of a very different LMS in the future is tantalising. We know that current systems such as Blackboard and Canvas are very good at managing learners and resources – making sure the right information is provided to the right people at the right time. Changes to the way in which staff and students collaborate through co-creation and sharing could render this form of LMS redundant in future.

Away from the NMC report, Steve Wheeler of Plymouth University presented on what’s hot and what’s not in learning technology. The video is well worth watching.

Steve identifies a huge range of technologies that will likely have an impact: voice-controlled interfaces; gestural computing; the Internet of Things (pervasive computing); wearable technologies; artificial intelligence; touch surfaces for multi-touch, multi-user working; virtual presence; immersive tech such as Oculus Rift for VR and AR; 3D printers and maker spaces. The list goes on.

Steve identified three key elements for the future:

  • Very social
  • Very personal
  • Very mobile

and this needs to be underpinned with developing digital literacy, particularly when wading through alt-facts and fake news. Our students need to learn how to check the veracity and relevance of materials.

Steve postulates that until the development of the PC or web, everything was teacher centred. Technology allows us to become learner-centred, but have we adjusted enough to being learner led?

This should impact the way in which we assess – education and training must go from recursive to discursive: no longer repeating or regurgitating materials from the teacher, but developing problem-solving skills and more through a discursive approach.

The changes are:

  • Analogue to digital
  • Closed to open
  • Tethered to mobile
  • Standardised to personalised
  • Isolated to connected


Finally, a new blog post from Eric Stoller looks at “Student Success, Retention, and Employability – Getting Digital in a High Tech, High Touch Environment”.

Eric identifies that the more engaged a student is during their university experience, the more successful they will be. Digital offers us the opportunity to increase the channels through which we communicate with and engage with our students.

Eric (as well as Steve above, and the NMC report) highlights the importance of digital capability, particularly through the lens of employability. Students need to graduate with the digital skills they will use in the workplace, not just those that they use to complete a university course. Interestingly Eric also highlights the need to teach students about their digital presence and identity.

Finally, he refers to the existence of a digital divide (again identified by NMC as digital equity) – “If your university is students first, that means all students”. This is a challenge: by providing the right kit, and more importantly developing the right skills and behaviours, we can get all staff and students to engage in a connected digital future.

Last year we enjoyed Steve Wheeler’s presentation at our Learning and Teaching Conference – I can’t wait to hear Eric Stoller later this year at the same event.


Differences in Student Outcomes

Successful outcomes for students are often used as a proxy for institutional quality, hence the use of good degree outcomes, or value added, in league tables. The forthcoming Teaching Excellence Framework will almost certainly look at student outcomes as a measure also. However, not all students succeed equally, and we know from our own work at StaffsUni of the gaps in attainment between different groups of students.

The recent Green Paper, as well as highlighting the possible future TEF, indicates the government’s desire to see an increase in numbers of students from the most disadvantaged backgrounds as well as looking to ensure that all students can achieve.

In the light of this, last Monday I attended a HEFCE conference in London “Addressing differences in student outcomes: Developing strategic responses”, which looked at the findings of research into differential outcomes from Kings College London, and was an opportunity to hear from others in the sector on how they are tackling these issues.

Sessions attended were: the introduction by Chris Millward, Director of Policy at HEFCE; a presentation by Anna Mountford-Zimdars of KCL; a session by Sorana Vieru and Malia Bouattia of the NUS; and finally a session by Philip Plowden, DVC of the University of Derby.

These are my notes of the day. Copies of the presentations can be viewed here.

Chris Millward HEFCE Director of Policy

Chris Millward started by considering where the government is on this agenda, linking the Green Paper, the Treasury plan and plans from BIS.

Government wants to see a more diverse range of backgrounds in HE, in terms of entry, success and outcomes. For instance: doubling the number of students from disadvantaged backgrounds by 2020; increasing the number of BME students by 20% by 2020; and expecting the sector to address differences in outcomes.

This means more responsibility for universities together with strengthened guidance to OFFA and the potential role of the Office for Students. There is an anticipated stronger role in quality assurance processes through the impact of TEF and the future need to measure difference in outcomes based on data and metrics agreed by government. This will lead to more targeted funding together with more emphasis on meeting obligations.

The HEFCE analysis shows an attainment gap for BME students, based on A-level analysis, and the more other factors you add in, the bigger the gaps become.

In addition, when looking at POLAR3 domicile, there are further unexplained differences in HE outcomes.

When considering students with a disability, the data suggests that those students who received DSA support perform above average, while those without perform less well.

On postgraduate progression, there is currently an unexplained difference in outcomes based on POLAR3 quintiles.

When considering employment, and looking at the 40-month survey rather than the 6-month DLHE, all POLAR3 quintiles have worse outcomes than quintile 5, particularly for professional employment. There are worse outcomes for students with a disability, irrespective of DSA, and there are worse employment outcomes for all categories of BME students, particularly in professional employment. Finally, on gender, men perform worse overall on employment, but better in professional employment.

The HEFCE approaches to working on closing the gaps in outcomes include:

  • National outreach programme
  • Funding for disabled students
  • Supporting successful outcomes
  • Catalyst fund

Anna Mountford-Zimdars – KCL

Dr Mountford-Zimdars presented the outcomes of a major piece of research into differential outcomes, which is available here.

“Access without success is no opportunity”

The research considered three questions:

  • What is the pattern (empirical)?
  • How do we explain it (causal model)?
  • How do we change it effectively (policy and empirical)?

The question was asked: “Do we need causality? If an intervention works, does the causal model matter?”

The pattern of differential attainment was explained using a model that looked through a lens of macro/meso/micro levels, and at experiences pre-HE, in HE and post-HE.

Four explanatory dimensions were proposed:

  • Curricula and learning
  • Relationships – a sense of belonging is probably the most important factor
  • Cultural, social and economic capital
  • Psychosocial and identity factors

From the research, which involved asking questions of a large number of institutions, it was clear that the level of awareness of the issue differed across institutions, although this may be changing now, possibly due to the proposals in the TEF.

In terms of those institutions that tackled the differential outcomes issues the most successfully:

  • Whole institution effect is most successful
  • Need students, academics and professional services working together
  • Bottom up approaches with strategic support
  • Universal and targeted interventions

Effective interventions were seen to be:

  • Improvements to T&L
  • Inclusive learning and curricula
  • Deconstructing assessment
  • Meaningful interactions
  • Role models and mentoring
  • Engagement with institution
  • Generally, few evaluations were found, especially a lack of long-term evaluations

The presentation ended with five groups of recommendations:

  • Evidence base
  • Raising awareness
  • Embedding agenda
  • Staff as change agents
  • Students as change agents

Sorana Vieru and Malia Bouattia – NUS

 This presentation started from a previous NUS report, Race for Equality, and went on to look at a new NUS campaign on liberating the curriculum.

From previous NUS work, 42% of students said that the curriculum did not reflect their experiences, particularly in history and philosophy. As well as looking at students as being in one particular demographic group, it is important to look at intersections between groups.

Work from NUS highlighted:

  • 23% of black students described their learning environment as cliquey
  • Disabled students are more dissatisfied in the NSS
  • 10% of trans students are not willing to speak up in class
  • Black students report lower levels of satisfaction in the NSS on assessment and feedback

There was a focus on liberation, equality and diversity, and the launch of a new campaign – “Liberate my Degree”. An online hub has been provided for officers and reps, with training resources to allow them to engage in debate in their institutions and to support them in becoming co-creators of the curriculum.

Getting There – Helen Hathaway and Philip Plowden

Speakers from the University of Derby showed the pragmatic steps they have taken to challenge the gap in attainment between white and BME students.

In terms of background, the University has 28,000 students, most of whom come from the state school sector; 20% of these self-identified as BME. The attainment gap was 24.6% in 2009-10. The impact of the work so far is that the gap had closed to 12.4% by 2014-15, although with an increase in attainment across all areas, this is a moving target.

The important thing is that there is no one single answer, so there was a need to stop looking for one, focus on the myriad interventions, and see what impact they have.

  • No magic bullet
  • Post racial inclusive approach
  • Suite of different strategies needed

Four main areas of interventions are used: Relationships, academic processes, psychological processes, and social capital.

The project at Derby explored data (down to module level) and relied on the regular programme health checks, which used a digest of metrics including attainment by ethnicity. In these, the DVC meets with programme leads to engage with course teams at the chalk face. Areas covered include: outcomes, finances, reliance on clearing, and staff numbers. In particular, the programme health checks looked at “spiky” degree profiles – looking at individual modules and gaps, not with an intention to play a blame game, but to ask what is going right and to ask others to consider that.

To support interventions, Derby developed PReSS (Practical Recipes for Student Success), which contains evaluations and case studies and can be accessed at: http://uodpress.wordpress.com

The key lessons learned were:

  • No simple solution – beware paralysis by analysis. Just have to crack on and do what works.
  • Learn from others
  • Post-racial inclusive approach. Difficult to reconcile this with some of the morning’s talk – is this unduly dismissive of liberation approaches?
  • Importance of communication – degree of profile. But once in the mainstream it might get lost.
  • Need consistent way to measure attainment gap.
  • Important to evaluate interventions.

Points from Discussions

A lively discussion followed, and the following are just snippets of some of the topics – in some cases these reflect discussion we have had in our own institution, but I add them in almost as provocations for further debate.

  • Is there a threat to academic staff when we discuss these BME and other attainment gaps? A danger of appearing accusatory?
  • Why are there differences between subjects such as business and nursing – do cohorts have an impact? Why do the subjects with the smallest attainment gaps want to engage in the debate the most?
  • How do we check who uses the resources to support inclusive learning, and should we check?
  • How do you liberate the curriculum, and how do we re-educate staff to draw on a wider range of ideas, since they are a product of their own subject and environment?
  • What about the attainment gap for students who live at home, where home life and working get in the way of study?

Conclusions

In all, a thought-provoking day. A lot of emphasis, as always, on the BME attainment gap, but also more opportunity to explore attainment more generally and to recognise how this agenda will become increasingly important post-TEF.

In terms of what we could do next: as we develop better internal metrics for modules and courses, we can start to see how we can use this information to better understand the outcomes that our students achieve. Linking this to revisions in the way in which we review our courses – from a quality assurance and enhancement perspective, as well as through a more data-centric health check – would provide the opportunity to have the right discussions, and to ensure that we maximise the opportunities for our students to be successful.


Times Higher Student Experience Survey 2016

Just before we enter league table season, the THE kicks off with their Student Experience Survey results.

This year the top university is Loughborough, followed by our geographical neighbours, Harper Adams, and then Sheffield.

Here’s what the VC of Loughborough attributes the success to:

Robert Allison, vice-chancellor of Loughborough, says that coming first in this year’s student poll was “absolutely fantastic, as it recognises all the excellent things that staff and students are doing here”.

At the heart of Loughborough’s success is the ethos that students should work with staff to create a good university experience for everyone on campus, Allison says. “When people visit us on open days, I tell them that if they’re wondering if they’ll have a TV in their room, this probably isn’t the university for them.”

At Loughborough “you can really embed yourself in the university, and if you do, you will have all sorts of chances and opportunities”, he continues.

For instance, final-year students often participate in a research project, while others take part in international secondments, such as those enjoyed by mechanical engineering students who have just returned from visiting the Massachusetts Institute of Technology.

“If you have that desire to co-create your university experience, rather than just seeing yourself as someone who shows up for 10 weeks a term, it takes you to a different level as a student,” Allison says.

As always, this is a survey based on a very small sample size compared with the NSS, but the outcomes are still interesting.

Staffordshire has risen 10 places to 78th this year. In terms of where we do well, we can look to see where our scores exceed the sector average:

  • helpful/interested staff
  • personal requirements catered for
  • good personal relationships with teaching staff
  • cheap shop/bar/amenities
  • tuition in small groups
  • fair workload

So, as we might expect, we do well in the way we work with our students, and we know that Stoke-on-Trent is a relatively cheap city in which to be a student.

Areas where we seem to be falling behind are around social life, community atmosphere and the environment on campus. Our ongoing investment in campus transformation should go a long way to addressing this, and by September 2016, when all of our computing, music, film and games students arrive onto the main redeveloped campus, we should find ourselves working in an even more lively environment.


I can’t get no satisfaction

Early in January the next round of the National Student Survey begins across the UK HE sector. This year, for many, will be seen as a dry run for what is to come in later years – the widely discussed Green Paper refers to using metrics to help gauge teaching excellence. Once we get past the first year of everyone being equally excellent, the NSS and DLHE are widely anticipated to be key measures, as well as possible measures of learning gain, since the paper does hint at a lack of satisfaction with current degree classification processes.

So before we check on progress on last year’s action plans, and start to think about how we introduce this year’s survey to our students, a couple of publications from the last week are worth bearing in mind.

Firstly, a research paper from QAA, written in part by Jo Williams of the University of Kent, to which Staffordshire University made a contribution. In “The Role of Student Satisfaction Data in Quality Assurance and Enhancement: How Providers Use Data to Improve the Student Experience”, Dr Williams looks at how different types of institutions approach the NSS, and shows that across all parts of the sector, institutions and senior staff question the NSS:

In particular, the issue of question 22 of the NSS, asking students about their overall satisfaction has created endless debates in academia, if not confusion, with professionals arguing that it is methodologically and professionally wrong to base league tables on a single question which is not in itself clear. Various senior academics we spoke to concurred with this theme.

The research did show that universities in all parts of the sector listen to what students say, and that they do make changes based on what the survey reveals, for instance:

The programmes change every year so sometimes it’s because the subject changes but very often it’s because students have expressed discontent with something. Therefore, you change the personnel teaching it. You change the way you teach it. You change the content, you change the assessment, you change the feedback, you change something about it. Or sometimes you just drop it.
(University B)

Changes in practice were noted across institutions:

Our data revealed that institutions have employed various changes in response to issues students raise in the satisfaction surveys. Among other practical changes, universities have:
– recruited academic advisors and officers to take charge of the NSS
– mapped internal surveys to mirror the NSS
– renewed their focus on learning and teaching
– revisited and improved timetabling systems
– raised structures including building sites, teaching rooms and sports complexes
– revisited their feedback and assessment mechanisms
– organised briefings with students to enlighten them about feedback and assessment, the NSS and its benefits
– replaced subjects, at times personnel, whose NSS scores keep falling
– introduced or empowered various forums and platforms to meet several times in a year to discuss the NSS, among others. Such forums found at nearly all institutions include: NSS forums, student experience action plans, education boards, NSS improvement forums and learning and teaching advisory groups

Across the institutions in the research, other similarities were seen in how data was used: for instance, comparing scores across schools, holding low-scoring schools to account, and comparing with other institutions.

In terms of league tables, where an institution appears in a table appears to influence the behaviour of the organisation.

In particular, institutions placed in the top 25% of the league tables appear to have a relaxed view of the NSS. They appear to put particular emphasis on improving the student experience and argue that this automatically triggers a higher satisfaction rate than being ‘obsessed with the NSS’ and improving league table position:

Whereas at the other end of the scale:

In contrast, institutions in the lower 25% of the student satisfaction league tables appear to place particular focus on improving their student satisfaction and subsequently their standings in the league tables.

The main conclusion of the work then is that:

In particular, institutions in the top 25% of league tables
(Universities A and B) appear to prioritise improving the student experience and let the NSS take care of itself, while those in the bottom 25% (Universities C and D) prioritise their NSS league table position and subsequently employ various tactics to promote the surveys.
Despite institutions adopting different approaches to the surveys based on league table positions, institutions generally listen to students’ demands raised in surveys and have responded by instigating various changes including recruiting academic advisers and officers to take charge of the NSS; mapping internal surveys to mirror the NSS; raising structures including building sites and revising their feedback and assessment processes.

What the paper doesn’t consider is the relative ranking of NSS scores by institutions – it is perfectly possible to score well on certain NSS questions, and so appear to outperform other institutions on a single measure, but this may not change institutional behaviours, which may be set to focus on NSS position rather than on the overall experience.

In other work out recently, from Stephen Gibbons, Eric Neumayer and Richard Perkins, writing in the Economics of Education Review, “Student satisfaction, league tables and university applications: Evidence from Britain” (Economics of Education Review 48 (2015) 148–164), the authors make the following points:

  • NSS scores have an impact on applications, but not a large one
  • students do not appear to respond directly to quality cues from satisfaction scores
  • students may already have a well developed knowledge about product quality based on perceptions of reputation and prestige
  • student satisfaction and league table position do not have a short term effect on market demand
  • the degree to which quality indicators affect demand is strongly linked to the amount of market competition for a given subject

Finally, in “Applying Models to National Surveys of Undergraduate Science Students: What Affects Ratings of Satisfaction?” (Educ. Sci. 2013, 3(2), 193–207) by Langan, Dunleavey and Fielding of Manchester Metropolitan University, the authors look at what influences the results seen for question 22 – overall satisfaction. We are all familiar with reading through a set of results, with great scores for most of the questions and sections, but a lower score for this final crucial question, which is the one used in all league tables.

The authors note the year on year consistency of results for individual subjects, noting how comparisons should be made:

Subjects were highly consistent year on year in terms of their relative performance in the satisfaction survey. This has implications for institutional decision-making particularly if subjects are wrongly compared against institutional averages, when comparisons should be made within subject areas (e.g., comparing with national subject averages, although this may be subject to error if courses contain different compositions of learners, for example in terms of ethnicity)

This is consistent with HEFCE advice, and is why, as an institution, we provide sector-average scores at JACS3 subject level for comparison.

Interestingly, questions about feedback were the weakest predictors of “Overall Satisfaction” whereas:

The best predictor of student satisfaction nationally for the three years analysed was “The course was well designed and running smoothly” followed by ratings of “Teaching”, “Organisation” and “Support”. This may vary slightly between subjects/institutions, so it is proposed that this type of quantitative approach to contextualising survey metrics can be used to guide institutions in resource allocation to tackle student experience challenges.

So our conclusions on how we approach the next NSS, and perhaps more importantly NSS 2017, could be:

  • carry on listening to students, responding and being seen to respond to surveys
  • make sure we focus on all the measures that make up a league table
  • make sure that courses are well organised and running smoothly
  • don’t expect league table moves to immediately be reflected in increased applications
  • and remember – the student experience is what really matters, not the survey itself.

2015 Student Academic Experience Survey

This year’s survey on Student Academic Experience has just been published by HEPI and HEA.

Under the headlines outlined by Nick Hillman of HEPI:

‘Course quality depends on more than contact hours and class size, but students do care deeply about these issues. They are notably less satisfied when they have fewer than 10 contact hours and classes of over 50 students. They also care more about whether their lecturers are trained to teach and have professional expertise than whether they are active researchers.

‘The most striking new finding is that a whopping three-quarters of undergraduates want more information about where their fees go. Providing this is coming to look like an inevitable consequence of relying so heavily on student loans. If it doesn’t happen soon, it could be forced on universities by policymakers.

‘The survey also provides the best available evidence on student wellbeing. Students are less likely to regard their lives as worthwhile and are less happy than others. This suggests good support services, including counselling, should be a priority despite the impending cuts.’

Looking at the results in more detail, there are interesting variations in the responses that students make depending on discipline and on the type of university that they attend, as well as some useful lessons for us to learn, so I’ve picked out some highlights.

Overall Academic Experience

The key reasons cited for experience not being as expected were not putting in enough effort, poor organisation and a lack of contact hours.

[Chart: reasons the academic experience fell short of expectations]

From an institutional perspective we can tackle this by being really clear about how much work we expect our students to do outside of scheduled classes. Module handbooks and guides need to provide explicit weekly detail of what work should be undertaken, both to set expectations and to explain to students that learning is not an act of passive consumption, but one of active participation.

Information, reflections on course choice and value for money

34% of students from England think they have received poor or very poor value for money, although students with more contact hours and who do more independent study are more satisfied with value for money. As above, we need to make sure we are making it really clear to our students what we expect from them, and what we provide them with.

Interestingly, the students who were least satisfied about value for money were also those who were least aware of how their tuition fees were spent.

[Chart: perceived value for money against awareness of how tuition fees are spent]

Maybe the message is twofold – firstly, institutions need to be transparent on how fees are spent, and why they need to cover more than just tuition; secondly, we should be using course-level talks and handbooks to reinforce to our students where, how and why we spend our money, and how they benefit.

Workload and Class Sizes

The variations in workload by subject area are not in themselves surprising, with the highest loads in medicine, creative arts and the sciences.

[Chart: weekly workload by subject area]

What does jump out though is the total number of hours some students are studying.

For instance, if a student is studying 4 modules of 15 credits each across 12 teaching weeks, plus 3 weeks for assessment, there would be 600 learning hours in total, which equates to 40 hours per week. Again, the message for us might be about how we set that expectation.
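That arithmetic can be checked in a couple of lines, assuming the standard convention of 10 notional learning hours per credit:

```python
# Weekly study hours implied by a credit load, assuming the usual UK
# convention of 10 notional learning hours per credit.
HOURS_PER_CREDIT = 10

def weekly_hours(modules, credits_per_module, teaching_weeks, assessment_weeks):
    """Total notional hours spread evenly across the whole semester."""
    total_hours = modules * credits_per_module * HOURS_PER_CREDIT
    return total_hours / (teaching_weeks + assessment_weeks)

# 4 x 15-credit modules over 12 teaching weeks plus 3 assessment weeks:
# 600 hours over 15 weeks = 40.0 hours per week
print(weekly_hours(4, 15, 12, 3))
```

The same check works for any module pattern – six 10-credit modules over the same semester gives an identical 40-hour week.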

Quality of Learning and Teaching

In the survey students were asked to comment on three characteristics of teaching staff:

  • whether they have received training in how to teach;
  • whether they are currently active researchers;
  • expertise in their professional or industrial field.

In the press articles that accompanied this publication, much was made of students stating that academic staff should have qualifications in teaching – overall 39% of students ranked this as the most important characteristic.

However, when we look at the different types of universities, this varies substantially:

[Chart: most important characteristic of teaching staff, by type of university]

For a million+ university such as ours, the most important characteristic that students are looking for is relevant industrial or professional experience, which might be expected given the vocational focus of this type of university. While the current government hasn’t proposed regulation of teaching in HE, it did commit to a “framework to recognise universities offering the highest teaching quality”. For us, though, we need to focus on making sure our teaching staff have the opportunity to develop and maintain their professional expertise, as much as, if not more than, gaining teaching recognition.

Finally on this – how would a student know if a member of teaching staff had a teaching qualification?

Students’ views on policy options

Students were asked how universities could save money. The answers are revealing:

[Chart: students’ preferred areas for university savings]

Overwhelmingly, students would prefer us to make savings on expenditure on buildings and sports and social facilities, whereas they would not want to see cuts to teaching hours or to student support facilities. This might conflict with what we need to do to recruit students in the first place – the scale of building and refurbishment in the sector has been huge since the 2012 increase in fees. This might attract “customers” in the first place, but it may not be what they really want in the longer term.

Conclusions

As always, this is an interesting addition to the canon of work on student experience. As we are in the process of analysing the results of our own internal Student Viewfinder Survey as well as looking at better ways of getting student evaluations, this may provide an indication of some of the questions we should be answering.

However, for me the key takeaways are:

  • the need to communicate expectations of how we expect students to learn independently
  • the linked need to make sure we explain how they will learn independently and take them to the point that they can do so successfully
  • the need to provide good transparent information on where we spend money
  • the need to support professional practice and for teaching staff to bring this into their teaching
  • the need to make sure we fund what students really need.


Guardian University Guide 2016

The latest Guardian University Guide has just come out. This is the league table that doesn’t include any reference to research impact or intensity in its metrics, and so is the one favoured by universities that focus on being teaching-led institutions.

A lot of emphasis is given to student experience, through the outcomes of the National Student Survey, and entry grades are dealt with twice – firstly in the details of entry tariff, and secondly in the measure of “value added”, which is an assessment of good degrees, but related to the entry grades of individual students. It’s notable that in previous years, Oxford had the highest value added score, so it is more a measure of good degrees than an assessment of supporting widening participation.
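The value-added idea can be sketched in code – this is a deliberately crude illustration of the concept, not the Guardian’s actual methodology, and the baseline, slope and institutional figures below are all invented for the example:

```python
# Crude illustration of a "value added" style measure: compare a
# university's observed rate of good degrees with the rate a simple
# (hypothetical) model would predict from its students' entry tariffs.

def expected_good_rate(avg_tariff, baseline=0.40, slope=0.001):
    """Hypothetical linear model: higher entry tariff -> more good degrees."""
    return min(1.0, baseline + slope * avg_tariff)

def value_added(observed_rate, avg_tariff):
    """Positive = more good degrees than entry grades alone would predict."""
    return observed_rate - expected_good_rate(avg_tariff)

# Two invented institutions: a highly selective one, and one recruiting
# students with lower entry grades but outperforming expectations.
print(value_added(0.90, 480))  # selective intake
print(value_added(0.70, 240))  # lower tariff, above expectation
```

On a model like this, a selective university can post a very high raw percentage of good degrees yet show only modest value added, which is the point made above about Oxford.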

The headlines from this year’s guide are:

Cambridge remains in the top spot, with Oxford second

Coventry rises to 15th, placing it above some Russell Group universities and making it the highest-placed post-92 university. How do they do it?

John Latham, vice-chancellor of Coventry University, says the university’s success is down to its focus on students’ needs. “We’re a modern university, but not just in the sense that we haven’t been around for as long – we’re very modern in our approach. We’re challenging the system. We’re bringing in new forms of pedagogy and listening to students.”

The university has three objectives: “teaching students well, making sure that students are listened to, and making sure they get good jobs at the end of their course,” says Ian Dunn, deputy vice-chancellor for student experience at Coventry.

Other big winners – Hull go up 21 places, Liverpool John Moores 22 places, De Montfort 20 places, Roehampton 22 places, Leeds Trinity 27 places, Sussex 24 places, Falmouth 22 places.

Going the other way – Northampton drop 17 places, Derby 23 places, UWE 30 places, UCLAN 18 places, Plymouth 19 places, Glyndwr 39 places.

Staffordshire University rise 7 places to 83rd – a third year of steady rises through the table, with better SSR results, improved value added and satisfaction with teaching.

[Chart: Staffordshire University’s Guardian guide position, 2014–16]


It’s all about the money, money, money

This week HESA have published the latest details of expenditure by universities, covering 2013-14. As an institution we have just gone through our own internal budget meetings, so it’s interesting to see how the money is spent across the sector.

[Chart: HESA sector expenditure, 2013-14]

(from https://www.hesa.ac.uk/pr/3561-press-release-216)

Firstly, let’s just consider the size of expenditure. For 2013-14 this was £29.4bn against income of £30.7bn, up from £25.8bn against income of £26.8bn in 2009-10.

As we go into election week, this is a reminder of the size of the sector and its growing importance to the economy, as well as the non-financial benefits of higher education that accrue to both the individual and to society.

The Times Higher reports on the data, identifying that the average surplus has gone up in the last year, and that the surpluses “support the view that the sector as a whole is financially sound”.

From that article, Phil McNaull, director of finance at the University of Edinburgh and deputy chair of the British Universities Finance Directors Group says

that surpluses should not lead people to think that things were now rosy.

“People look at organisations making a surplus and they think ‘profit’; they think you’re OK,” he says. “They don’t understand that you need to make surpluses to fund the future.”

And the future does hold challenges for the sector. Chief among them is the demand for capital spending, which is already evident on a walk around most university campuses: the growth in the number of shiny new buildings reflects how improving the student experience has become a priority amid an increasingly competitive recruitment environment.

I think we are all well aware of this, and that’s why the proposed new developments for our Stoke-on-Trent campus, on top of the work already carried out, mean that we will be able to offer a great student experience in a city-centre campus.


“Good” Degrees

We all know that gaining a good degree is important, perhaps more so now than ever. The increasingly consumerist approach by students might be enshrined in “what do I need to do to get a 2(i)?”, but in many cases this is also accompanied by a commitment to work that was perhaps less of a focus when I first studied. That might also be attributable to students’ changing perceptions of their higher education – seeing it as a transaction in which they engage to gain clearly defined outcomes, rather than the wider exploration that HE might have been considered to be in some non-existent golden era.

A good degree is understood to be a benefit to the individual – it’s likely to help open doors in getting that first graduate job. It’s also beneficial for institutions for their students to be successful in this way: all university league tables include “good degrees” or some variant thereof in their analysis, and so the university that awards high numbers of good degrees can expect to reap the rewards in league table position. Of course there is also virtuous circle effect here – universities that are at the top of the tables may be the most selective, and able to recruit the students with the highest entry tariff scores in the anticipation that they will thrive. Other institutions will argue that they provide a greater amount of value added to students with lower entry grades.

In January, HESA published its first data release, which showed the range of degree classifications as follows:

[Chart: first degree classifications, 2013/14]

72% of first degrees undertaken through full-time study in 2013/14 achieved first or upper second classifications compared to 54% of those undertaken through part-time study.

Now that more detailed data has become available through Heidi, we can look to see how different institutions perform on this measure – and whose outputs have changed significantly.

So here are the top 10 universities for awarding good degrees in 2013-14:

Institution | 2013 % 1sts and 2(1)s | 2014 % 1sts and 2(1)s | Difference
The University of Oxford | 92% | 92% | 0%
Conservatoire for Dance and Drama | 91% | 91% | 0%
Guildhall School of Music and Drama | 87% | 91% | 4%
Central School of Speech and Drama | 88% | 88% | 0%
The University of St Andrews | 88% | 88% | 0%
The University of Cambridge | 87% | 88% | 1%
University College London | 87% | 88% | 1%
Royal Academy of Music | 77% | 88% | 11%
Imperial College of Science, Technology and Medicine | 88% | 87% | -1%
University of Durham | 85% | 87% | 2%

And at the other end of the results….

Institution | 2013 % 1sts and 2(1)s | 2014 % 1sts and 2(1)s | Difference
London Metropolitan University | 51% | 55% | 4%
University of Bedfordshire | 48% | 55% | 7%
The University of East London | 54% | 54% | 0%
Glyndŵr University | 54% | 54% | 0%
University College Birmingham | 46% | 54% | 8%
University Campus Suffolk | 56% | 53% | -3%
University of Wales Trinity Saint David | 49% | 51% | 2%
SRUC | 44% | 51% | 7%
The University of Buckingham | 43% | 51% | 8%
The University of Sunderland | 54% | 50% | -4%

For those of us who have an interest in league tables, then the interesting thing to look at will be those universities which have seen significant changes in the percentages of good degrees that they award. Hence we might look to see some league table gains (ceteris paribus) for the following:

Institution | 2013 % 1sts and 2(1)s | 2014 % 1sts and 2(1)s | Difference
Leeds Trinity University | 56% | 69% | 13%
Royal Agricultural University | 51% | 63% | 12%
Royal Academy of Music | 77% | 88% | 11%
Bournemouth University | 65% | 76% | 11%
Glasgow School of Art | 59% | 69% | 10%
The University of Wolverhampton | 50% | 59% | 9%

noting that Wolverhampton doesn’t engage in league tables.

The biggest drops are for:

Institution | 2013 % 1sts and 2(1)s | 2014 % 1sts and 2(1)s | Difference
University Campus Suffolk | 56% | 53% | -3%
Writtle College | 52% | 49% | -3%
Heythrop College | 83% | 79% | -4%
Royal Conservatoire of Scotland | 79% | 75% | -4%
The University of Sunderland | 54% | 50% | -4%
The Royal Veterinary College | 75% | 66% | -9%
University of the Highlands and Islands | 71% | 58% | -13%
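Picking out the biggest movers from tables like these is easy to do programmatically; a minimal sketch using a handful of the figures quoted above:

```python
# Rank year-on-year movers in "good degree" rates (2013 -> 2014),
# using a few of the published percentages from the tables above.
good_degrees = {
    "Leeds Trinity University": (56, 69),
    "Royal Academy of Music": (77, 88),
    "University Campus Suffolk": (56, 53),
    "The Royal Veterinary College": (75, 66),
    "University of the Highlands and Islands": (71, 58),
}

# Sort by percentage-point change, biggest risers first.
movers = sorted(
    ((name, y2014 - y2013) for name, (y2013, y2014) in good_degrees.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, diff in movers:
    print(f"{name}: {diff:+d} percentage points")
```

Run against the full Heidi extract rather than this hand-picked sample, the same sort gives the risers and fallers listed above in one pass.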

As well as looking at the percentages of good degrees, with a little bit of Heidi magic we can look to see how various student characteristics have an impact on outcomes. A particular interest of mine is attainment of students from a BME background, and in considering how any attainment gap can be reduced. This will form the subject of a later post.