Reflections on TEF

It’s been almost a week since TEF results were made public, and some of the predictable coverage, posturing, and agonising have occurred. Here are a few of my thoughts.

The importance of the written submission

In advance, we were all told that the written submission mattered, but also that the initial hypothesis, based purely on metrics, was expected to be the factor that would determine classification. Looking at the results, plenty of universities have been awarded a TEF rating higher than their initial metrics would suggest. (This is personally pleasing, since I wrote a significant amount of my previous employer’s submission.) The commentary provided by the TEF panel on each submission makes it clear that where a written submission demonstrated that an institution understood why it missed benchmarks, could explain this in terms of contextual data, and could show that activity was taking place to remediate the situation, a higher award was possible.

The press didn’t understand what was being measured

In advance of publication I was asked on Twitter whether anyone outside the sector was going to be interested in the results. Inevitably, those papers who have a vested interest (by publishing their own university guides) or who have a reputation for being a TEF booster (I’m looking at you here, The Times) were always going to publish something.

We inevitably saw articles reminding us that Southampton, LSE and Liverpool of the Russell Group had not performed as expected, and that this showed a shake-up in the sector. Equally, there was criticism that the expected ranking, or established order, was not being replicated.

Any paper that publishes its own league table is going to be concerned if another form of ranking does not tally with its figures. But this is to misunderstand what TEF is – it’s about performance against benchmarks, not absolute performance. Hence the difficulty for some universities in scoring above already high benchmarks, and for the press in creating a simple story from a more complex narrative.

Universities love to celebrate

There was plenty of gold across those who felt they’d done well! This despite the rumblings and complaints in advance that the idea of three levels of ranking, like medals, was reductive and couldn’t possibly communicate the complexity of what a university does.

How much does it matter to the sector?

TEF clearly matters to those in the sector, and will have implications for behaviours in the future. Universities already work hard to make sure that they optimise their data returns to HESA, that they get good scores in the NSS by promoting and managing survey completion, and that they get good scores in the DLHE by managing those returns.

In future, these activities might drive performance management behaviours in universities even more than at present, with possible unforeseen consequences – courses and subject areas that perform poorly on a key metric may no longer be considered viable, especially while TEF continues to operate at institutional level.

For planning departments, we can expect to see ever more sophisticated models of academic portfolio performance, and increased scrutiny of data returns.

(From the Modern Toss Work postcard set: http://ow.ly/hFV530cT60U)

The impact on fees has been temporarily removed, and with possible changes to funding in future (let’s face it, HE funding is back on the agenda after the recent General Election), TEF as an instrument of marketisation through differential fees loses its power.

How much does it matter to the press?

For those in the press, TEF might just be a way to get easy headlines about perceived poor performance of established universities, while expressing shock at the performance of some FE colleges.

For the specialist press, commentariat and twitterati, TEF is a gift – something for the wonks to pore over and luxuriate in, in that quiet period at the end of an academic year.

How much does it matter to the punters?

For parents and potential students, TEF is just one more set of information to use, and has to be added to existing marketing collateral, multiple league tables, and guidance from schools and colleges. Without a clear explanation of what is being measured (particularly the issue of relative rather than absolute performance), it’s not a straightforward measure, but just one more to add to the mix. Coupled with the Guardian University Guide’s concept of “value added”, it’s hardly surprising that potential students aren’t always clear about what might be on offer.

Finally, TEF may just be ignored if it does not provide the confirmation bias that people often rely on when making these kinds of decisions. For example, I have a son who wants to study History in a year’s time. Both Staffordshire and Durham scored Silver. But I’m only going to recommend one of those.

You can bet, though, that universities will shout about their TEF outcome (provided it was good) at this summer’s open days.