Reflections on TEF

It’s been almost a week since TEF results were made public, and some of the predictable coverage, posturing, and agonising have occurred. Here are a few of my thoughts.

The importance of the written submission

In advance, we were all told that the written submission mattered, but also that the initial hypothesis, based purely on metrics, was expected to be the factor that would determine classification. Looking at the results, plenty of universities have been awarded a TEF rating higher than their initial metrics would suggest. (This is personally pleasing, since I wrote a significant amount of my previous employer’s submission.) The commentary provided by the TEF panel on each submission makes it clear that where a written submission demonstrated that an institution understood why it missed benchmarks, could explain this in terms of contextual data, and could show that activity was taking place to remediate the situation, then a higher award was possible.

The press didn’t understand what was being measured.

In advance of publication I was asked on Twitter whether anyone outside the sector was going to be interested in the results. Inevitably those papers that have a vested interest (by publishing their own university guides) or a reputation for being a TEF booster (I’m looking at you here, The Times) were always going to publish something.

We inevitably saw articles reminding us that Southampton, LSE and Liverpool, of the Russell Group, had not performed as expected, and that this showed a shake-up in the sector. Equally, there was criticism that the expected ranking, or established order, was not being replicated.

Any paper that publishes its own league table is going to be concerned if another form of ranking does not tally with its figures. But this is to misunderstand what TEF is – it’s about performance against benchmarks, not absolute performance, hence the difficulty for some universities in scoring above already high benchmarks, and for the press in creating a simple story from a more complex narrative.
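To make the relative-versus-absolute point concrete, here is a toy sketch (all numbers and provider names invented, and the real TEF benchmarking is far more involved): a provider with a lower raw score can still sit above its own benchmark, while a higher-scoring provider can sit below its own.

```python
# Toy illustration of benchmarked metrics: each provider is compared
# against its OWN benchmark, not against other providers or an absolute bar.
# Figures are entirely invented for illustration.
providers = {
    # name: (observed NSS satisfaction %, benchmark %)
    "University A": (88.0, 91.0),   # high raw score, but below its high benchmark
    "University B": (78.0, 74.0),   # lower raw score, yet above its benchmark
}

for name, (observed, benchmark) in providers.items():
    diff = observed - benchmark
    flag = "positive" if diff > 0 else "negative"
    print(f"{name}: {observed}% vs benchmark {benchmark}% -> {diff:+.1f} ({flag})")
```

On raw scores University A "beats" University B; against benchmarks the picture reverses, which is exactly the story the simple league-table framing misses.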

Universities love to celebrate

There was plenty of gold on display among those who felt they’d done well! This despite the rumblings and complaints in advance that the idea of three levels of award, like medals, was reductive and couldn’t possibly communicate the complexity of what a university does.

How much does it matter to the sector?

TEF clearly matters to those in the sector, and will have implications for behaviours in the future. Universities already work hard to make sure that they optimise their data returns to HESA, that they get good scores in the NSS by promoting and managing survey completion, and that they get good scores in DLHE by managing those returns.

In future, these activities might drive performance management behaviours in universities even more than at present, with possible unforeseen consequences – courses and subject areas that perform poorly on a key metric may no longer be considered viable, especially while TEF continues to operate at institutional level.

For planning departments, we can expect to see ever more sophisticated models of academic portfolio performance, and increased scrutiny of data returns.

(From the Modern Toss Work postcard set: http://ow.ly/hFV530cT60U)

The impact on fees has been temporarily removed, and with possible changes to funding in future (let’s face it, HE funding is back on the agenda after the recent General Election), then TEF as an instrument of marketisation through differential fees loses its power.

How much does it matter to the press?

For those in the press, TEF might just be a way to get easy headlines about perceived poor performance of established universities, while expressing shock at the performance of some FE colleges.

For the specialist press, commentariat and twitterati, TEF is a gift – something for the wonks to pore over and luxuriate in, in that quiet period at the end of an academic year.

How much does it matter to the punters?

For parents and potential students, TEF is just one more set of information to use, and has to be added to existing marketing collateral, multiple league tables, and guidance from schools and colleges. Without a clear explanation of what is being measured (particularly the issue of relative rather than absolute performance), it’s not a straightforward measure, but just one more to add to the mix. Coupled with the Guardian University Guide concept of “value added”, it’s hardly surprising that potential students aren’t always clear about what might be on offer.

Finally, TEF may just be ignored if it does not feed the confirmation bias that often drives these kinds of decisions. For example, I have a son who wants to study History in a year’s time. Both Staffordshire and Durham scored Silver. But I’m only going to recommend one of those.

You can bet though, that universities will shout about their TEF outcome (provided it was good) at this summer’s open days.

A New Home

My previous blog has now been migrated to this site (actually this has been a mirror site for quite a long time, but never used beyond that).

It’s time to start writing new content, and building up stats again – but just to show how much the old blog was read, here’s a peek at the final usage stats:

 

And that “best ever”? That was the day I wrote about the Guardian University Guide in 2014. In fact most of the high-traffic posts have been about league tables, although one of the latest posts, “Does UK HE have a Retention Problem”, has been pretty popular.

Does UK HE have a retention problem?

Last night I attended an event at King’s College London, hosted by the UPP Foundation and Wonkhe, looking at retention issues in UK higher education. The format was a series of initial thoughts from each of the five panel members, followed by a lively discussion, showing the importance of this topic.


Richard Brabner

Richard introduced this as the second of three workshops on the student journey. He pointed out that HESA stats on non-continuation show that this is getting worse, especially for students from disadvantaged backgrounds. He reminded the audience that in light of this, Les Ebdon of OFFA expects the next access agreements to focus on retention.

Liz Thomas

Liz started by explaining that UK figures for retention are in fact much better than in most European countries. In countries with free tuition, there was a feeling that getting students out of the system was part of the quality system. In a world dominated by fees and student loans, that attitude cannot prevail. We admit students to our courses and so we have an obligation to help them succeed. So we do have an issue around student success and retention, in particular around differential levels of success, retention and employment outcomes when we consider BME, WP and other factors.

From the HEA/Paul Hamlyn What Works project it was clear that learning and teaching is critical to student success and retention, by building a sense of belonging in the academic sphere. This goes beyond the curriculum: it is about the whole institution recognising that it needs to make students successful, and considering the role and contribution of all staff.

Sorana Viera

Sorana of the NUS believes that UK HE does have a retention problem for some groups of students, and suggested that an unforeseen consequence of TEF is that game-playing to satisfy the metrics could exacerbate the situation. The NUS view was that the rushed nature of TEF potentially leaves dangerous holes. Since the key metric that universities can impact is non-continuation, all eyes should be on retention.

Universities should invest more in those supporting activities that are evidence-based, and Sorana cited the What Works project as an example. If evidence is presented in accessible ways, then the NUS will champion it.

In particular, the impact for commuting students was raised – these are students with financial pressures, family and work commitments, who may have chosen to study at a local university which may not be the right university for them.

Alex Proudfoot

Alex showed that some of the issues for alternative providers are quite different. Their students are much more likely to be from a BME background, or to be aged over 30, so these providers are dealing with very different cohorts of students.

A focus for alternative providers was delivering courses built around employability, by creating industry links and ultimately an industry community within the college, where staff and students might collaborate on projects outside of class.

In terms of pathways and transitions into HE, students who go through the same provider from level 2 and 3 have better retention at HE levels.

For students with low entry qualifications, classes on study skills are a compulsory part of the curriculum, rather than an additional optional choice for the student.

Ross Renton
Ross highlighted the huge differences in retention and success based on ethnicity. He emphasised the need to develop an understanding of who is joining your university or course, and to develop a relationship with them before they arrive.

At Hertfordshire they had previously targeted POLAR quintile 1 and 2 students on entry, and provided peer mentoring plus other additional activity, tailored to each student. Retention figures improved by 43% for these students, and DLHE shows a better rate of graduate employment. This intensive personalisation works, but is expensive.

Ross also highlighted the fact that problems need to be owned by everyone – it’s not a matter of sending a student off to some student hub, but all academic staff need to take ownership. There is also a need to systemise personal tutoring, so that key and meaningful conversations take place at the right times for all students, including at all transition periods, long holidays etc.

In the future Ross saw some risk in being overly focused on the use of metrics and analytics – this is still about people working with people.

Panel Q&A

Key points in the Q&A session were around:

  • How do we support hourly paid lecturers – not delivering HE on the cheap, but supporting the right staff properly?
  • The current retention metrics don’t allow for students to step out of HE with interim qualifications in a flexible framework.
  • Staff also need to feel that they belong, so we need to consider institutional culture. How do you support students through a whole-institution approach?
  • How can we build success in L&T, including retention and success, into reward and recognition for staff?
  • How do we make the campus more “sticky” for students living at home? The research on commuting students suggests that these students feel the campus is not for them, and they feel marginalised and invisible. Details in a prospectus will cover accommodation but not local travel. Universities were often not set up to support these students, expecting them to be in 4–5 days a week.
  • The tax burden for those who drop out but have student debt – ethics and who should pay? One year of study should be seen as a success.
  • Can we use analytics to create better-informed interventions? Otherwise it is difficult to personalise in a mass system without good real-time information.

Takeaways

Certain key factors stand out:

  • The need to look carefully at differential retention and success, and to ensure that TEF does not drive perverse behaviours
  • The opportunities to use better analytics to personalise student support
  • The need for rigorous and meaningful personal tutor systems
  • A pressing need to understand how a sticky campus can support commuting students and meet their specific needs.

 

EdTech futures in the Connected University

Digital technology is bringing huge changes to all industries and sectors, not least higher education. It isn’t the future, it’s the present. This article summarises three recent publications: firstly the annual NMC Horizon report, which I’ve previously blogged on here; secondly a talk by Steve Wheeler, the keynote speaker at last year’s Learning and Teaching Conference; and finally a piece by Eric Stoller, who will be delivering a keynote at this year’s conference.

Firstly let’s look at this year’s NMC Horizon report. This is categorised into:

  • Key Trends Accelerating Higher Education Technology Adoption
  • Significant Challenges Impeding Higher Education Technology Adoption
  • Important Developments in Technology for Higher Education

Usefully NMC have provided a summary of their predictions from previous years, and it’s worth noting that not all of their predictions come to pass; equally some remain on the radar for a number of years. Audrey Watters has previously provided a critique of NMC for those who’d like a different view.

Nonetheless, this is a useful starting point, and we can map our own activities against all of the 18 trends/challenges/developments, but here I’ll focus on a few.

As we walk around this campus (and many others in the UK), we can see how learning spaces are being transformed to allow different ways of learning to take place.

We have a major focus on improving staff and student digital capabilities, recognising that this will help drive innovation, as well as improve employability prospects of our graduates.

The achievement gap is one I have blogged about previously – this continues to be a difficult, multi-faceted problem. Technology will not provide all the answers, but may help level the playing field in some areas.

The possibility of a very different LMS in the future is tantalising. We know that current systems such as Blackboard and Canvas are very good at managing learners and resources – making sure the right information is provided to the right people at the right time. Changes to the way in which staff and students collaborate through co-creation and sharing could render this form of LMS redundant in future.

Away from the NMC report, Steve Wheeler of Plymouth University presented on what’s hot and what’s not in learning technology. The video is well worth watching.

Steve identifies a huge range of technologies that will likely have an impact: voice-controlled interfaces; gestural computing; the Internet of Things (pervasive computing); wearable technologies; artificial intelligence; touch surfaces for multi-touch, multi-user working; virtual presence; immersive tech such as Oculus Rift for VR and AR; 3D printers and maker spaces. The list goes on.

Steve identified three key elements for the future:

  • Very social
  • Very personal
  • Very mobile

and this needs to be underpinned with developing digital literacy, particularly when wading through alt-facts and fake news. Our students need to learn how to check the veracity and relevance of materials.

Steve postulates that until the development of the PC or web, everything was teacher centred. Technology allows us to become learner-centred, but have we adjusted enough to being learner led?

This should impact the way in which we assess – education and training must go from recursive to discursive, no longer repeating or regurgitating materials from the teacher, but developing problem-solving skills through a discursive approach.

The changes are:

  • Analogue to digital
  • Closed to open
  • Tethered to mobile
  • Standardised to personalised
  • Isolated to connected

 

Finally, a new blog post from Eric Stoller looks at “Student Success, Retention, and Employability – Getting Digital in a High Tech, High Touch Environment”.

Eric identifies that the more engaged a student is during their university experience, the more successful they will be. Digital offers us the opportunity to increase the channels through which we communicate with and engage with our students.

Eric (as well as Steve above, and the NMC report) highlights the importance of digital capability, particularly through the lens of employability. Students need to graduate with the digital skills they will use in the workplace, not just those that they use to complete a university course. Interestingly Eric also highlights the need to teach students about their digital presence and identity.

Finally he refers to the existence of a digital divide (again identified by NMC as digital equity) – “If your university is students first, that means all students”. This is a challenge, but by focusing on providing the right kit, and more importantly developing the right skills and behaviours, we can get all staff and students to engage in a connected digital future.

Last year we enjoyed Steve Wheeler’s presentation at our Learning and Teaching Conference – I can’t wait to hear Eric Stoller later this year at the same event.

 

 

 

My Social Media Profile

As a university we are committed to becoming the Connected University, and are making great strides in changing our approach to learning and teaching, to our campus transformation and to the way in which we run the business, all enabled by digital tools and technologies.

On an individual level, we can reasonably expect colleagues to embrace aspects of digital technology to enhance their work, to change the way in which they communicate with each other, with our students and with other stakeholders.

When we look at the amount of content being created, and the amount of communication taking place in just one minute, we can’t avoid being engaged with social media:


(from https://www.domo.com/blog/data-never-sleeps-4-0/) 

At last year’s Learning and Teaching Conference, we asked attendees to make a pledge of what they might do differently, based on what they were taking away from the conference. On reviewing these, it was clear that lots of colleagues wanted to dip their toes into the world of social media, or, if they were already using such tools, to explore and expand their use further.

This short article is a reflection of how I use social media. I’m not suggesting this is the only way, and I’m sure I can identify gaps in my own practice.

As a starting point, it’s worth looking at the work of David White, who proposes that the term “digital native” has had its day, and that we shouldn’t judge a person’s digital literacy based solely on age, but in terms of how comfortable they are in using technology. White’s model of digital residents vs visitors is a useful starting point for assessing our own digital skills (in addition to the various diagnostic tests we can use).


Through this approach I can map my own digital profile, which in itself raises a number of questions: where do I live in the digital world? Can I be found? Can I be found in multiple channels? How do I maintain a level of authenticity? How do I moderate my voice between different channels and different audiences?

My social media profile then is primarily found in:

  • Twitter
  • Facebook
  • Strava
  • Flickr
  • WordPress

Twitter is my most work-related tool, although not everything posted here is work-related. As part of building an authentic voice, it’s important to reveal enough of yourself as a person – your other interests and commitments – to allow followers to gain a greater insight into you. For example, following a recent accident, the message on Twitter from a nationally known HE commentator was simply “How’s the bike?”.

Through Twitter, I’ve developed a really useful network outside the University, often with people who are influential in the sector, but whom I wouldn’t meet otherwise. It means that when attending meetings across the country, more often than not, you already know a lot about the people you will be meeting. And both last year’s keynote speakers for our L&T conference and this year’s came from people I’d got to know through Twitter.

We all know of the danger of social media becoming an echo chamber – it’s good to follow people who you don’t agree with on all things, otherwise we are missing the benefits of academic debate.

Facebook for me is purely social. I do follow feeds from the University and from various schools and departments. My posts here are almost never work-related, and hopefully the privacy settings are such that I can maintain a more private profile, which focuses on family, friends and hobbies.

Strava is totally social – only look at this if you want to know how far and how slowly I ride a bike.

Flickr is for serious photography – quick snaps may appear on Facebook or Strava; anything that requires any amount of editing will end up on Flickr.

WordPress is the software that powers many of the world’s blogs. This blog itself is a WordPress installation on the university system. I have a second site as a backup, where I can experiment with some additional WordPress tools and integrations. I’ve written before about why I write a blog – it provides a means to communicate in longer form than Twitter, and to provide my personal analysis of changes in the HE sector, both for internal and external consumption.

There are a whole load of tools I don’t use – Snapchat and Instagram come to mind immediately. If nothing else, I’m not a great fan of the #artificialhashtag. However, institutionally we do need to be on top of these – these are the tools our students are using.

Finally, I’ve mapped a number of other tools – WhatsApp, Skype for Business, FaceTime, FB Messenger – these are my comms channels in addition to my two email accounts.

There’s a lot to keep on top of!

 

 

Guardian University Guide 2017

The second big university league table of the year, the Guardian University Guide, was published today – one which the compilers say is the most student-friendly, as it focuses on subject-level scores in more detail, and measures things that are of importance to students. In other words, research is not a part of the table.

“The methodology focuses on subject-level league tables, ranking institutions that provide each subject area, according to their relevant statistics.

To ensure that all comparisons are as valid as possible, we ask each institution which of their students should be counted in which subject so that they will only be compared to students taking similar subjects at other universities.

Eight statistical measures are employed to approximate a university’s performance in teaching each subject. Measures relate to both input – for example, expenditure by the university on its students – and output – for example, the probability of a graduate finding a graduate-level job. The measures are knitted together to get a Guardian score, against which institutions are ranked.”

A lot of emphasis is given to student experience, through the outcomes of the National Student Survey, and entry grades are dealt with twice – firstly in the details of entry tariff, and secondly in the measure of “value added”, which is an assessment of good degrees relative to the entry grades of individual students.
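The general shape of the methodology – several per-subject measures standardised and then knitted together into a single score, against which institutions are ranked – can be sketched roughly as follows. The weights, measure names and figures here are entirely invented; the Guardian’s actual weightings and standardisation are more involved.

```python
# Rough sketch: standardise each measure across institutions (z-scores),
# combine with weights into one composite score, then rank.
# All weights and data are invented for illustration.
def zscores(values):
    """Population z-scores; returns zeros if there is no spread."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / sd if sd else 0.0 for v in values]

institutions = ["Uni X", "Uni Y", "Uni Z"]
measures = {                      # measure name: (weight, raw values per institution)
    "nss_teaching":   (0.4, [85, 78, 90]),
    "spend_per_head": (0.3, [7.2, 8.1, 6.5]),
    "value_added":    (0.3, [5.8, 6.4, 5.1]),
}

totals = [0.0] * len(institutions)
for weight, raw in measures.values():
    for i, z in enumerate(zscores(raw)):
        totals[i] += weight * z

ranking = sorted(zip(institutions, totals), key=lambda t: t[1], reverse=True)
```

Note how an institution that tops no single measure can still top the composite – which is part of why different guides, with different measures and weights, produce different orderings.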

The top four places are unchanged – Cambridge, Oxford, St Andrews and Surrey. The new entrant into the top five is Loughborough.

The big winners this year are: Manchester Met, Northumbria, City, Bradford, Anglia Ruskin, Derby, Liverpool Hope and Sunderland.

While going down are: Liverpool John Moores, Queen Margaret, Brunel, Brighton, Cumbria and Birmingham City.

Staffordshire University have pleasingly gone up 14 places to 69th.


 

 

 

 

 

Normal service is resumed?

After a quiet time on the wonk front, last week saw the publication of the White Paper, two new reports on the employability of STEM graduates, the announcement of a Higher Education Bill in the Queen’s Speech, and the launch of the technical consultation on the Teaching Excellence Framework, not forgetting the previous week’s plans for consulting on the future of DLHE. Anyone would think that HE wonks had been twiddling their thumbs for a while, with nothing to critique or criticise. For a really good set of resources on this, it is worth looking at Wonkhe.

The White Paper contained few real surprises – changes to quality arrangements, making it easier for new entrants to the market, the introduction of a teaching excellence framework, changes to the landscape and research support – all were previously consulted on in the previous Green Paper. Overall, the sector has not been unreservedly supportive, but even with a small parliamentary majority, the bill is likely to become law, and so we need to learn how we can work as well as possible within this revised landscape.

Overall, the changes are to drive further the marketisation of higher education – no matter how much we might suggest that HE does not operate as a fully open market, the government is wedded to the idea that increasing competition will drive up quality. Hence the idea that new entrants – “challenger” institutions – will be able to provide competition to existing incumbents. Similarly, the teaching excellence framework is touted as providing more information to prospective students, hence helping them to make more informed decisions. There is, of course, little evidence that students make decisions purely on data, and for many students there may not be a free choice of where they study, based on financial circumstances and family or work commitments.

Nonetheless, we will have a TEF, and so it’s important to understand what will drive success in this, so that we can get the best possible outcome which reflects our performance. One piece of good news is that the government did listen to the sector in terms of timing of implementation, even if concerns about the metrics to be used fell upon stony ground.

From the technical consultation, we know that the following principles should underpin TEF:

  • keep bureaucracy and burden to a minimum
  • be voluntary, allowing providers to consider the benefits and costs of applying before deciding whether or not they wish to
  • allow for diverse forms of excellence to be identified and recognised
  • support rather than constrain creativity and innovation
  • respect institutional autonomy
  • be based on peer assessment
  • be robust and transparent
  • result in clear judgements about excellence for students, employers and other stakeholders
  • avoid driving perverse or unintended behaviours in the pursuit of demonstrating excellence
  • be sufficiently flexible to allow for further development as the TEF evolves.

From year 2 of TEF, institutions that choose to be assessed can be judged to meet one of three outcomes: Meets Expectations, Excellent or Outstanding. To get to this, we would be assessed on teaching quality, learning environment, student outcomes and learning gain.

And the part we need to be mindful of is how this will be assessed.

Teaching quality will be based on questions 1- 9 of the National Student Survey (teaching and assessment and feedback). Learning environment will be judged on questions 10-12 of the NSS (academic support) and non-continuation data from HESA, while outcomes will be assessed by the results of DLHE.

This does look remarkably like a league table, and so institutions will work harder than ever to make sure that their NSS results and DLHE figures show outcomes in the best possible light.

In addition to the data, providers will supply a written submission of no more than 15 pages. This is where we will be able to provide more context to what we do – examples cited in the document include the use of student surveys, collecting and responding to module feedback, staff development activities, timeliness of feedback, use of employers on validation panels, and levels of contact time and independent study.

This is going to be a lot to cover in 15 pages, so it will be key for institutions to have their policies really clearly defined in terms of how their various mechanisms work, and how they can be shown to improve student experience and outcomes.

Our recent work on changing module evaluation processes and observation of teaching, and our review of quality processes will put us in a good position to explain how we manage our academic delivery to provide the best experience for students. We will clearly need to focus more on some of our student survey scores, and get to the bottom of why we have such a wide variety of reported experiences.

Next steps for us will be:  how we review our student survey outcomes; how we deliver our new employability strategy; how we ensure that we use the information from module evaluations and teaching observations to optimise student success, and how we review the performances of all of our courses.

There will no doubt be ongoing resistance to TEF – the metrics chosen are still not ideal, and when we move to subject-level analysis there will be concerns regarding reliability of data – but this is a system we are going to have to work with. It would make sense to be as well prepared as we can be.

Complete University Guide 2017

The first of the major University league tables, the Complete University Guide, is published today.

This table uses ten measures: Student Satisfaction; Research Quality; Research Intensity; Entry Standards; Student–Staff Ratio; Spending on Academic Services; Spending on Student Facilities; Good Honours Degrees Achieved; Graduate Prospects; and Completion.

From the CUG press release:

Dr Bernard Kingston, principal author of TheCompleteUniversityGuide.co.uk, said: “There is a considerable degree of stability at the upper end of the league table this year. While dramatic changes may be newsworthy, this stability indicates that the rankings are robust and credible for young people seeking a university place – our primary purpose.”
This year’s release sees TheCompleteUniversityGuide.co.uk publish a number of new rankings. Alongside new subject tables for Creative Writing, Forensic Science and Occupational Therapy, there is now a Creative and Performance Arts table, containing 14 institutions that do not feature in the Main Table.
Dr Kingston said: “We have simultaneously released a survey of universities’ relative success in resolving student complaints. This shows significant variations between universities and is an important source of information for prospective students who want to know that their complaints will be effectively resolved.” (See attached Press Release and Table).
“Last year’s Higher Education Green Paper, Higher education: teaching excellence, social mobility and student choice, stated that applicants need access to robust, timely and objective information, based on criteria that are straightforward and easily understood.

So the top ten are:

2017  2016  Change  Institution
   1     1       0  Cambridge
   2     2       0  Oxford
   3     3       0  London School of Economics
   4     4       0  Imperial College London
   5     5       0  St Andrews
   6     5      -1  Durham
   7    11      +4  Loughborough
   8     7      -1  Warwick
   9     9       0  Lancaster
  10    13      +3  University College London

Not really any surprises there. Staffordshire falls 6 places to 109th.

What is always of more interest are the big movers, both up and down, and the identification through reading the individual subject tables to see why these changes have happened.

So this year’s big winners are:

  • Manchester Met – up 16
  • Harper Adams – up 14
  • Buckingham – up 14
  • Liverpool Hope – up 14
  • Sunderland – up 14
  • Falmouth – up 12
  • Winchester – up 12
  • Edge Hill – up 11
  • Middlesex – up 11

At the other end we have

  • Oxford Brookes – down 11
  • St Mark and St John – down 12
  • Brighton – down 14
  • Queen Margaret – down 15
  • Royal Agricultural University – down 17
  • Arts University Bournemouth – down 19

The section on complaints and their resolution will be of interest to academic registrars. Over a three-year period, the number of completion of procedures letters issued (after the internal complaints process has been exhausted) per 1,000 students is ranked. The ranking here does not follow any meaningful pattern – it might be assumed that students at one type of university are more likely to use a complaints procedure than others, or that one part of the sector would be better at dealing with complaints, but this is clearly not the case. Pleasingly, the figure quoted for Staffordshire is considerably better than for some. Whether this data is of any meaningful use to prospective students is debatable.
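The metric itself is simple enough to sketch (university names and figures below are invented, not from the Guide): letters over three years, normalised per 1,000 students, ranked from lowest upwards.

```python
# Sketch of the complaints measure: completion-of-procedure (CoP) letters
# issued over three years, per 1,000 students, ranked lowest (best) first.
# All figures are invented for illustration.
cop_data = {
    # name: (CoP letters over 3 years, average student population)
    "Uni P": (12, 18000),
    "Uni Q": (30, 15000),
    "Uni R": (9, 22000),
}

rates = {
    name: letters / students * 1000
    for name, (letters, students) in cop_data.items()
}
ranked = sorted(rates.items(), key=lambda kv: kv[1])  # lowest rate first
```

Normalising per 1,000 students is what makes small and large institutions comparable; the raw count alone would simply track institution size.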

 

Earnings by Course and University

As revealed in legislation last year, the government has been keen to see the impact of the subject studied, and where, on the earnings of graduates. The initial research has now been carried out by the Institute for Fiscal Studies, and looks at data that is more long term than the current DLHE data, and crucially considers student loan repayments and tax returns.

Graduates from richer family backgrounds earn significantly more after graduation than their poorer counterparts, even after completing the same degrees from the same universities. This is one of many findings in new research published today which looks at the link between earnings and students’ background, degree subject and university attended.

Some of the findings could be considered underwhelming:

  • students from wealthy backgrounds out-earn others, when studying the same subject at the same institution
  • graduates in creative arts earn less than others.

Inevitably the reaction from some places has been to roll out the “more means worse” arguments, for instance here in the Daily Telegraph, where Fraser Nelson writes:

If a book is ever written about the mis-selling of higher education, it might start with such adverts. There’s no doubt that doctors and lawyers earn a bomb; no doubt that an Oxbridge degree opens many gilded doors. But studying urban dance at Peckham University or media studies at the University of Scunthorpe is another story entirely.

Yes, the average graduate premium may be generous. But today, all too many ropey institutions hide behind the word “university” – offering dismal courses that serve neither students nor society. And by the time the students realise that they’ve been sold a pup, it’s too late.

A more detailed reading of the paper would reveal that although there may be 23 institutions where the median salary for male graduates is lower than that for non-graduates (as shown in an almost indecipherable graph), the authors state:


At the other end of the spectrum, there were some institutions (23 for men and 9 for women) where the median graduate earnings were less than those of the median non-graduate ten years on. It is important to put this in some context though. Many English higher education institutions draw a significant majority of their students from people living in their own region. Given regional differences in average wages, some very locally focused institutions may struggle to produce graduates whose wages outpace England-wide earnings, which include those living in London etc. To illustrate regional differences, employment rates in the period under consideration varied between 66% in the North East and 75% in the East of England, and data from the Annual Survey of Hours and Earnings suggests that average full-time earnings for males were approximately 48% higher in London than in Northern Ireland, and around 34% higher for females. Regional differences are therefore important and we take them into account in our analysis of graduates’ earnings.

 

More interesting, though, is how this data might be used in the future. In this paper, the authors have not published results against all named institutions, although most of the Russell Group universities are named. In future, the intention is to do so. One argument could be to use the data to allow differential fees, or to set differential RAB charges by subject or institution. Alternatively the information could be used to provide better student information and to challenge policies on social mobility. A recent article in the Times Higher looks at the different views from across the sector.

A clear message for us, however, might be to continue our focus on developing students’ employability skills, and to make sure that those skills which might currently be missing are deeply embedded into courses or into extra-curricular activities. For instance, we can do more to develop numeracy and digital capability skills by understanding exactly what it is that potential employers want to see in the graduates they employ.

More challenging is the issue of social capital. As a university that has at its heart a belief in education as a transformational activity, and a commitment to widening participation, we might do well to understand more about how we can help our students develop social and cultural capital – without this they will always find it more difficult than those for whom university was an expected rite of passage. It’s very likely that for many students – especially those who are local or who commute in daily – their sense of bonding capital is high. The corollary is that their level of bridging capital – that which they need to develop new networks – is lower than for students from different backgrounds. Identifying activities that will help our students develop this could be key. Some possible areas are placements, internships, and cross-disciplinary projects, where students have to work on real-world problems with students from other subjects, pulling them out of their comfort zone.

Over the next few weeks it will be instructive to see how politicians react to this new data, and from this for us to identify specifically what we should do to respond.


HEFCE Revised Operating Model for Quality Assessment

Last week HEFCE published their revised operating model for quality assessment. This is based on the responses to the sector consultation that took place last year, in which we, along with many other universities, identified areas of concern. Some of these have been addressed. However, this is also part of the current sectoral land grab for responsibility for quality; at the same time as publishing, HEFCE has put out to tender various aspects of its quality work.

Key points to note from the revised operating model:

  • “future quality assessment arrangements should seek to encourage innovation in learning and teaching, rather than driving providers towards risk-averse activities and homogenised provision.”
  • “approach for implementation is therefore designed to be proportionate, risk-based and grounded in the context of each individual provider and its students”
  • a set of baseline regulatory requirements will still be based on parts of the existing quality code and the framework for higher education qualifications
  • for new entrants there will be a gateway process followed by a developmental period of enhanced scrutiny and support
  • for established providers, a review of their own review processes, followed by a data-based Annual Provider Review and a revised periodic review visit

Some common areas of contention in the responses from the sector were: comparability of standards; a potential national register of external examiners; and the role of governing bodies.

A large section of the document covers comparability of standards and the classification algorithms used. The document states that when reviewing the original proposals:

Arguments mobilised against the proposals included:
• an opposition in principle to the funding bodies acting in an area where institutional autonomy is prized
• a view that there was no particular problem to be resolved, or that the specific proposals would not resolve whatever problems might exist
• a series of more practical concerns relating to increasing the burden on external examiners, thereby disincentivising the people on whom the successful operation of the system depends.

But that “student and PSRB respondents were much clearer that modernisation in this area was important, with some suggesting that the proposed reforms did not go far enough”

HEFCE have moved away from the proposal for a national register of external examiners, but talk instead of training examiners to ensure that they are able to check comparability of standards – there is still a worry that good degree rates are rising and that these may not be defensible.

The role of governors was an area that many universities had plenty to say about in their responses to the consultation, where it was felt that governing bodies may not be best placed to make direct judgements about academic quality. Again, HEFCE have clarified their expectation:

The role of the governing body would be to receive reports and challenge assurances from within the institution. It should not be drawn into quality management activities itself. We recognise the predominant role of senates and academic boards (or equivalent) in academic governance, and the responsibility of the accountable officer and senior executive team, and would expect an individual governing body to be clear about the formal relationships between the elements of the governance arrangements in its own institutional context.

There’s plenty more to digest. As always, WonkHe have a guide to how the new system will work, written by Louisa Darian.

What will be interesting now is the transitional arrangements and the pilots to be run during 2016-17.