Learning and Teaching Conference 2016

Last week we held our annual learning and teaching conference, around the theme of Digital Capability. What a success!

We had more attendees than ever before, and a real buzz around the building, as people moved between keynote lectures, the parallel breakout sessions and the fringe stands.

The day started with a welcome from our VC, Prof Liz Barnes, before our first keynote speaker, Helen Beetham, introduced the subject for the day with her talk “Digital Capability, Beyond Digital Capability”.

My key takeaway from the talk was the need to develop digital capability to provide a “capacity to thrive”, and that students will always want what we can uniquely offer, namely:

  • Learning relationships
  • Sense of belonging
  • Security – walled gardens with pathways out
  • Credibility
  • Distinctiveness
  • Specialism
  • Reputation

Following this, our first breakout sessions looked at: Innovative use of technology to enhance teaching; Digital support for student learning; Digital insights to improve learning, and Digital identity and capability. These provided a chance for staff from across the institution to showcase their work, and prompt discussion of how digital tools can be used to improve how we deliver our courses.

Lunchtime saw us run a fringe event for the first time – a chance to talk to university suppliers such as Blackboard and lynda.com, and to some of our support teams.

The keynote after lunch was by Steve Wheeler, on Learning in the Age of Remix.

Steve challenged us on how digital tools change the way we teach, how physical and digital spaces are blurring in a hyper-connected world, and how technology is not a silver bullet – it should be used wisely or not at all. Most importantly, he reminded us that it’s a fabulous time to be an academic.

Following further breakout sessions, we returned for a final plenary and Q&A session.

So what next?

Some great feedback has already appeared across social media using the hashtag #StaffsLT16

If you attended, you’ll be asked to complete a Qualtrics survey.

All of the presentations are now available on our conference blog here. Videos of the keynote presentations will be available as soon as they have gone through post-production and editing.

All attendees were asked to fill in a pledge card asking, what will you do differently? We’ll be sending these back to you in due course as a reminder, and also so that we can provide the development and support you need.

We’re already looking at what we can do to improve the conference experience further, and how we start to plan next year’s event.

We’ll be building our digital strategy to help all our staff and students get the most out of the technology we have to make us the Connected University – this conference was just the starting point – it’s going to be an exciting year!

 

Guardian University Guide 2017

The second big university league table of the year, the Guardian University Guide, was published today. The compilers say it is the most student-friendly table, as it focuses on subject-level scores in more detail and measures things that are of importance to students. In other words, research is not part of the table.

“The methodology focuses on subject-level league tables, ranking institutions that provide each subject area, according to their relevant statistics.

To ensure that all comparisons are as valid as possible, we ask each institution which of their students should be counted in which subject so that they will only be compared to students taking similar subjects at other universities.

Eight statistical measures are employed to approximate a university’s performance in teaching each subject. Measures relate to both input – for example, expenditure by the university on its students – and output – for example, the probability of a graduate finding a graduate-level job. The measures are knitted together to get a Guardian score, against which institutions are ranked.

A lot of emphasis is given to student experience, through the outcomes of the National Student Survey, and entry grades are dealt with twice – firstly in the details of entry tariff, and secondly in the measure of “value added”, which is an assessment of good degrees, but related to the entry grades of individual students.
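The “knitting together” of measures described in the quoted methodology can be sketched roughly as follows. To be clear, the institutions, raw values and weights below are invented purely for illustration, and the Guardian’s actual standardisation and weighting are considerably more involved:

```python
# A hypothetical sketch of composite-score ranking: standardise each
# measure across institutions, then combine with weights into one score.
# All names, values and weights here are made up for illustration only.

measures = {
    "Uni A": {"nss_teaching": 85, "spend_per_student": 7.2, "career_prospects": 70},
    "Uni B": {"nss_teaching": 90, "spend_per_student": 5.1, "career_prospects": 80},
    "Uni C": {"nss_teaching": 78, "spend_per_student": 8.5, "career_prospects": 65},
}
weights = {"nss_teaching": 0.4, "spend_per_student": 0.2, "career_prospects": 0.4}

def standardise(values):
    """Rescale raw values to the 0-1 range (min-max scaling)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

# Standardise each measure across institutions, then take the weighted sum.
names = list(measures)
scores = {name: 0.0 for name in names}
for m, w in weights.items():
    scaled = standardise([measures[n][m] for n in names])
    for name, s in zip(names, scaled):
        scores[name] += w * s

# Institutions are then ranked against the composite score.
ranking = sorted(names, key=lambda n: scores[n], reverse=True)
```

The point of the sketch is that the final ranking is highly sensitive to the choice of weights, which is one reason different league tables place the same institution so differently.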

The top 4 places are unchanged – Cambridge, Oxford, St Andrews and Surrey. The entrant into the top 5 is Loughborough.

The big winners this year are: Manchester Met, Northumbria, City, Bradford, Anglia Ruskin, Derby, Liverpool Hope and Sunderland.

While going down are: Liverpool John Moores, Queen Margaret, Brunel, Brighton, Cumbria and Birmingham City.

Staffordshire University has pleasingly gone up 14 places to 69th.

Normal service is resumed?

After a quiet time on the wonk front, last week saw the publication of the White Paper and two new reports on the employability of STEM graduates, the announcement of a Higher Education Bill in the Queen’s Speech, and the launch of the technical consultation on the Teaching Excellence Framework, not forgetting the previous week’s plans for consulting on the future of the DLHE. Anyone would think that HE wonks had been twiddling their thumbs for a while, with nothing to critique or criticise. For a really good set of resources on all this, it is worth looking at WonkHE.

The White Paper contained few real surprises – changes to quality arrangements, making it easier for new entrants to the market, the introduction of a teaching excellence framework, changes to the landscape and research support – all were previously consulted on in the previous Green Paper. Overall, the sector has not been unreservedly supportive, but even with a small parliamentary majority, the bill is likely to become law, and so we need to learn how we can work as well as possible within this revised landscape.

Overall, the changes are intended to drive further the marketisation of higher education – no matter how much we might argue that HE does not operate as a fully open market, the government is wedded to the idea that increasing competition will drive up quality. Hence the idea that new entrants – “challenger” institutions – will be able to provide competition to existing incumbents. Similarly, the teaching excellence framework is touted as providing more information to prospective students, helping them to make more informed decisions. There is, of course, little evidence that students make decisions purely on data, and for many students there may not be a free choice of where they study, based on financial circumstances and family or work commitments.

Nonetheless, we will have a TEF, and so it’s important to understand what will drive success in this, so that we can get the best possible outcome which reflects our performance. One piece of good news is that the government did listen to the sector in terms of timing of implementation, even if concerns about the metrics to be used fell upon stony ground.

From the technical consultation, we know that the following principles should underpin TEF:

  • keep bureaucracy and burden to a minimum
  • be voluntary, allowing providers to consider the benefits and costs of applying before deciding whether or not they wish to
  • allow for diverse forms of excellence to be identified and recognised
  • support rather than constrain creativity and innovation
  • respect institutional autonomy
  • be based on peer assessment
  • be robust and transparent
  • result in clear judgements about excellence for students, employers and other stakeholders
  • avoid driving perverse or unintended behaviours in the pursuit of demonstrating excellence
  • be sufficiently flexible to allow for further development as the TEF evolves.

From year 2 of the TEF, institutions that choose to be assessed can be judged to meet one of three outcomes: Meets Expectations, Excellent or Outstanding. To get there, we would be assessed on: teaching quality, learning environment, student outcomes and learning gain.

And the part we need to be mindful of is how this will be assessed.

Teaching quality will be based on questions 1–9 of the National Student Survey (teaching, and assessment and feedback). Learning environment will be judged on questions 10–12 of the NSS (academic support) and non-continuation data from HESA, while outcomes will be assessed by the results of the DLHE.

This does look remarkably like a league table, and so institutions will work harder than ever to make sure that their NSS results and DLHE figures show outcomes in the best possible light.

In addition to the data, providers will make a written submission of no more than 15 pages. This is where we will be able to provide more context for what we do – examples cited in the document include: use of student surveys, collecting and responding to module feedback, staff development activities, timeliness of feedback, use of employers on validation panels, and levels of contact time and independent study.

This is going to be a lot to cover in 15 pages, so it will be key for institutions to have their policies really clearly defined in terms of how their various mechanisms work, and how they can be shown to improve student experience and outcomes.

Our recent work on changing module evaluation processes and observation of teaching, and our review of quality processes will put us in a good position to explain how we manage our academic delivery to provide the best experience for students. We will clearly need to focus more on some of our student survey scores, and get to the bottom of why we have such a wide variety of reported experiences.

Next steps for us will be: how we review our student survey outcomes; how we deliver our new employability strategy; how we ensure that we use the information from module evaluations and teaching observations to optimise student success; and how we review the performance of all of our courses.

There will no doubt be ongoing resistance to the TEF – the metrics chosen are still not ideal, and when we move to subject-level analysis there will be concerns about the reliability of data – but this is a system we are going to have to work with. It would make sense to make sure we are as well prepared as we can be.

Complete University Guide 2017

The first of the major University league tables, the Complete University Guide, is published today.

This table ranks on ten measures: Student Satisfaction; Research Quality; Research Intensity; Entry Standards; Student:Staff Ratio; Spending on Academic Services; Spending on Student Facilities; Good Honours Degrees Achieved; Graduate Prospects; and Completion.

From the CUG press release:

Dr Bernard Kingston, principal author of TheCompleteUniversityGuide.co.uk, said: “There is a considerable degree of stability at the upper end of the league table this year. While dramatic changes may be newsworthy, this stability indicates that the rankings are robust and credible for young people seeking a university place – our primary purpose.”
This year’s release sees TheCompleteUniversityGuide.co.uk publish a number of new rankings. Alongside new subject tables for Creative Writing, Forensic Science and Occupational Therapy, there is now a Creative and Performance Arts table, containing 14 institutions that do not feature in the Main Table.
Dr Kingston said: “We have simultaneously released a survey of universities’ relative success in resolving student complaints. This shows significant variations between universities and is an important source of information for prospective students who want to know that their complaints will be effectively resolved.”
“Last year’s Higher Education Green Paper, Higher education: teaching excellence, social mobility and student choice, stated that applicants need access to robust, timely and objective information, based on criteria that are straightforward and easily understood.”

So the top ten are:

2017 Position | 2016 Position | Change | Institution
1 | 1 | 0 | Cambridge
2 | 2 | 0 | Oxford
3 | 3 | 0 | London School of Economics
4 | 4 | 0 | Imperial College London
5 | 5 | 0 | St Andrews
6 | 5 | -1 | Durham
7 | 11 | +4 | Loughborough
8 | 7 | -1 | Warwick
9 | 9 | 0 | Lancaster
10 | 13 | +3 | University College London

Not really any surprises there. Staffordshire falls 6 places to 109th.

What is always of more interest are the big movers, both up and down, and the identification through reading the individual subject tables to see why these changes have happened.

So this year’s big winners are:

  • Manchester Met – up 16
  • Harper Adams – up 14
  • Buckingham – up 14
  • Liverpool Hope – up 14
  • Sunderland – up 14
  • Falmouth – up 12
  • Winchester – up 12
  • Edge Hill – up 11
  • Middlesex – up 11

At the other end we have

  • Oxford Brookes – down 11
  • St Mark and St John – down 12
  • Brighton – down 14
  • Queen Margaret – down 15
  • Royal Agricultural University – down 17
  • Arts University Bournemouth – down 19

The section on complaints and their resolution will be of interest to academic registrars. Over a three-year period, institutions are ranked on the number of Completion of Procedures letters issued, after the internal complaints process has been exhausted, per 1,000 students. The ranking here does not follow any meaningful pattern – it might be assumed that students at one type of university are more likely to use a complaints procedure than others, or that one part of the sector would be better at dealing with complaints, but this is clearly not the case. Pleasingly, the figure quoted for Staffordshire is considerably better than for some. Whether this data is of any meaningful use to prospective students is debatable.

 

League Tables – WhatUni Guide

Before the heavy-hitting university league tables are produced, we get the results of some others.

First is the Whatuni.com guide, which is produced based on student views.

The top 10 this year were:

  1. Harper Adams University
  2. Loughborough University
  3. Swansea University
  4. Bangor University
  5. University of Leeds
  6. University of Exeter
  7. Nottingham Trent University
  8. University of South Wales
  9. St Mary’s University, Twickenham
  10.  Leeds College of Art

So clearly not the same as one of the “normal” league tables.

Staffordshire has risen from 96th in 2015 to 76th in 2016, so it’s pleasing to see a better response from our students.

 

Earnings by Course and University

As revealed in legislation last year, the government has been keen to see the impact of subject studied, and institution attended, on the earnings of graduates. The initial research has now been carried out by the Institute for Fiscal Studies, and looks at data that is more long-term than the current DLHE data, crucially drawing on student loan repayments and tax returns.

Graduates from richer family backgrounds earn significantly more after graduation than their poorer counterparts, even after completing the same degrees from the same universities. This is one of many findings in new research published today which looks at the link between earnings and students’ background, degree subject and university attended.

Having carried out the research, some of the findings could be considered underwhelming:

  • students from wealthy backgrounds out-earn others, when studying the same subject at the same institution
  • graduates in creative arts earn less than others.

Inevitably the reaction from some places has been to roll out the “more means worse” arguments, for instance here in the Daily Telegraph, where Fraser Nelson writes:

If a book is ever written about the mis-selling of higher education, it might start with such adverts. There’s no doubt that doctors and lawyers earn a bomb; no doubt that an Oxbridge degree opens many gilded doors. But studying urban dance at Peckham University or media studies at the University of Scunthorpe is another story entirely.

Yes, the average graduate premium may be generous. But today, all too many ropey institutions hide behind the word “university” – offering dismal courses that serve neither students nor society. And by the time the students realise that they’ve been sold a pup, it’s too late.

A more detailed reading of the paper would reveal that although there may be 23 institutions where the median salary for male graduates is lower than for non graduates (as shown in this almost indecipherable graph), the authors state:

At the other end of the spectrum, there were some institutions (23 for men and 9 for women) where the median graduate earnings were less than those of the median non-graduate ten years on. It is important to put this in some context though. Many English higher education institutions draw a significant majority of their students from people living in their own region. Given regional differences in average wages, some very locally focused institutions may struggle to produce graduates whose wages outpace England-wide earnings, which include those living in London etc. To illustrate regional differences, employment rates in the period under consideration varied between 66% in the North East and 75% in the East of England, and data from the Annual Survey of Hours and Earnings suggests that average full-time earnings for males were approximately 48% higher in London than in Northern Ireland, and around 34% higher for females. Regional differences are therefore important and we take them into account in our analysis of graduates’ earnings.

 

More interesting, though, is how this data might be used in the future. In this paper, the authors have not published results against all named institutions, although most of the Russell Group universities are named. In future, the intention is to show this. One argument could be to use the data to allow differential fees, or to set differential RAB charges by subject or institution. Alternatively, the information could be used to provide better student information and to challenge policies on social mobility. A recent article in the Times Higher looks at the different views from across the sector.

A clear message for us, however, might be to continue our focus on developing students’ employability skills, and to make sure that skills which might currently be missing are deeply embedded into courses or into extra-curricular activities. For instance, we can do more to develop numeracy and digital capability skills by understanding exactly what potential employers want to see in the graduates they employ.

More challenging is the issue of social capital. As a university that has at its heart a belief in education as a transformational activity, and a commitment to widening participation, we might do well to understand more about how we can help our students develop social and cultural capital – without this they will always find it more difficult than those for whom university was an expected rite of passage. It’s very likely that for many students – especially those who are local or who commute in daily – the sense of bonding capital is high. The corollary is that their level of bridging capital – that which they need to develop new networks – is lower than for students from different backgrounds. Identifying activities that will help our students develop this could be key. Some possible areas are placements, internships, and cross-disciplinary projects, where students have to work on real-world problems with students from other subjects, pulling them out of their comfort zone.

Over the next few weeks it will be instructive to see how politicians react to this new data, and to identify specifically what we should do in response.

Differences in Student Outcomes

Successful outcomes for students are often used as a proxy for institutional quality, hence the use of good degree outcomes, or value added, in league tables. The forthcoming Teaching Excellence Framework will almost certainly also look at student outcomes as a measure. However, not all students succeed equally, and we know from our own work at StaffsUni of the gaps in attainment between different groups of students.

The recent Green Paper, as well as highlighting the possible future TEF, indicates the government’s desire to see an increase in numbers of students from the most disadvantaged backgrounds as well as looking to ensure that all students can achieve.

In the light of this, last Monday I attended a HEFCE conference in London “Addressing differences in student outcomes: Developing strategic responses”, which looked at the findings of research into differential outcomes from Kings College London, and was an opportunity to hear from others in the sector on how they are tackling these issues.

Sessions attended were: the introduction by Chris Millward, Director of Policy at HEFCE; a presentation by Anna Mountford Zimnars of KCL; a session by Sorana Vieru and Malia Bouattia of NUS; and finally a session by Philip Plowden, DVC of the University of Derby.

These are my notes of the day. Copies of the presentations can be viewed here.

Chris Millward HEFCE Director of Policy

Chris Millward started by considering where the government is on this agenda, linking the Green Paper, the Treasury plan and plans from BIS.

Government wants to see a more diverse range of backgrounds in HE, in terms of entry, success and outcomes: for instance, doubling the number of students from disadvantaged backgrounds by 2020, increasing the number of BME students by 20% by 2020, and asking the sector to address differences in outcomes.

This means more responsibility for universities together with strengthened guidance to OFFA and the potential role of the Office for Students. There is an anticipated stronger role in quality assurance processes through the impact of TEF and the future need to measure difference in outcomes based on data and metrics agreed by government. This will lead to more targeted funding together with more emphasis on meeting obligations.

The HEFCE analysis shows an attainment gap for BME students, based on A-level analysis and the more that you add in other factors, the bigger the gaps become.

In addition, when looking at POLAR3 domicile, there are further unexplained differences in HE outcomes.

When considering students with a disability, the data suggests that those who received DSA support perform above average, while those without perform less well.

On postgraduate progression, there is currently an unexplained difference in outcomes based on POLAR3 quintiles.

When considering employment – looking at the 40-month survey rather than the 6-month DLHE – all POLAR3 quintiles have worse outcomes than quintile 5, particularly for professional employment. There are worse outcomes for students with a disability, irrespective of DSA, and there are worse employment outcomes for all categories of BME students, particularly in professional employment. Finally, on gender, men perform worse overall on employment, but better in professional employment.

The HEFCE approaches to working on closing the gaps in outcomes include:

  • National outreach programme
  • Funding for disabled students
  • Supporting successful outcomes
  • Catalyst fund

Anna Mountford Zimnars – KCL

Dr Zimnars presented the outcomes of a major piece of research into differential outcomes, which is available here.

“Access without success is no opportunity”

The research considered three questions:

  • What is the pattern? (empirical)
  • How do we explain it? (causal model)
  • How do we change it effectively? (policy and empirical)

The question was asked: “Do we need causality? If an intervention works, does the causal model matter?”

She explained the pattern of differential attainment using a model that looked through a lens of macro/meso/micro levels, and at experiences pre-HE, during HE and post-HE.

Four explanatory dimensions were proposed:

  • Curricula and learning
  • Relationships – a sense of belonging is probably the most important factor
  • Cultural, social and economic capital
  • Psychosocial and identity factors

From the research, which involved questioning a large number of institutions, levels of awareness of the issue differed across institutions, although this may be changing now, possibly due to the proposals in the TEF.

In terms of the institutions that tackled the differential outcomes issues most successfully:

  • A whole-institution effect is most successful
  • Students, academics and professional services need to work together
  • Bottom-up approaches with strategic support
  • Universal and targeted interventions

Effective interventions were seen to be:

  • Improvements to teaching and learning
  • Inclusive learning and curricula
  • Deconstructing assessment
  • Meaningful interactions
  • Role models and mentoring
  • Engagement with the institution

However, there were generally few evaluations of these interventions, and especially a lack of long-term evaluations.

The talk ended with five groups of recommendations:

  • Evidence base
  • Raising awareness
  • Embedding agenda
  • Staff as change agents
  • Students as change agents

Sorana Vieru and Malia Bouattia – NUS

 This presentation started from a previous NUS report, Race for Equality, and went on to look at a new NUS campaign on liberating the curriculum.

From previous NUS work, 42% of students said that the curriculum did not reflect their experiences particularly in history and philosophy. As well as looking at students as being in one particular demographic group, it was important to look at intersections between groups.

Work from NUS highlighted:

  • 23% of black students described learning environment as cliquey
  • Disabled students more dissatisfied in NSS
  • 10% of trans students not willing to speak up in class
  • Black students report lower levels of satisfaction on NSS on assessment and feedback

There was a focus on liberation-equality-diversity and the launch of a new campaign – “Liberate my Degree”. An online hub has been provided with resources for officers and reps with training resources to allow them to engage in debate in their institutions and to support becoming co-creators of curriculum.

Getting there – Helen Hathaway and Philip Plowden

Speakers from the University of Derby showed the pragmatic steps they have taken to tackle the gap in attainment between white and BME students.

In terms of background, the University has 28,000 students, most of whom come from the state school sector, and 20% of whom self-identify as BME. The attainment gap was 24.6% in 2009-10. The impact of the work so far is that the gap has closed to 12.4% in 2014-15, although as there was an increase in attainment across all areas, this is a moving target.

The important thing is that there is no one single answer, so there was a need to stop looking for one and instead focus on the myriad interventions and see what impact they have:

  • No magic bullet
  • Post racial inclusive approach
  • Suite of different strategies needed

Four main areas of interventions are used: Relationships, academic processes, psychological processes, and social capital.

The project at Derby explored data (down to module level) and relied on regular programme health checks, which used a digest of metrics including attainment by ethnicity. In these, the DVC meets with programme leads to engage with course teams at the chalk face. Areas covered include: outcomes, finances, reliance on clearing, and staff numbers. In particular, the health checks looked at “spiky” degree profiles – examining individual modules and gaps, not with the intention of playing a blame game, but to ask what is going right and invite others to consider that.

To support interventions, Derby developed PReSS – Practical Recipes for Student Success – which contains evaluations and case studies and is available at: http://uodpress.wordpress.com

The key lessons learned were:

  • There is no simple solution – beware paralysis by analysis; just crack on and do what works.
  • Learn from others.
  • A post-racial, inclusive approach. It was difficult to reconcile this with some of the morning’s talks – is it unduly dismissive of liberation approaches?
  • The importance of communication and a degree of profile – but once in the mainstream, the work might get lost.
  • A consistent way to measure the attainment gap is needed.
  • It is important to evaluate interventions.

Points from Discussions

A lively discussion followed, and the following are just snippets of some of the topics – in some cases these reflect discussion we have had in our own institution, but I add them in almost as provocations for further debate.

  • Is there a threat to academic staff when we discuss BME and other attainment gaps? A danger of appearing accusatory?
  • Why are there differences between subjects such as business and nursing – do cohorts have an impact? Why do the subjects with the smallest attainment gaps want to engage in the debate the most?
  • How do we check who uses the resources to support inclusive learning, and should we check?
  • How do we liberate the curriculum, and how do we re-educate staff to draw on a wider range of ideas, since they are a product of their own subject and environment?
  • What about the attainment gap for students who live at home, where home life and working get in the way of study?

Conclusions

In all, a thought-provoking day. There was a lot of emphasis, as always, on the BME attainment gap, but also more opportunity to explore attainment more generally and to recognise how this agenda will become increasingly important post-TEF.

In terms of what we could do next: as we develop better internal metrics for modules and courses, we can start to use this information to understand better the outcomes that our students achieve. Linking this to revisions in the way we review our courses – from a quality assurance and enhancement perspective, as well as through a more data-centric health check – would provide the opportunity to have the right discussions, and to ensure that we maximise our students’ opportunities to be successful.

 

Non Continuation Rates

Last week, HESA published their latest data on student continuation rates. This is an important set of figures for a number of reasons: non-continuation directly affects the finances of universities; it is potentially a failure for the individual as well as the institution; and the data is used in some league tables.

A concern is that, overall, the non-continuation rate has risen across the sector (and indeed for us at Staffordshire University), with the national figure rising from 5.7% to 6.0% of students who entered in 2013-14 not progressing to the second year. The headline statistics are:

  • 6.0% of UK domiciled, young, full-time, first degree entrants in 2013/14 did not continue in higher education in 2014/15.
  • 10.2% of UK domiciled, full-time, first degree starters in 2013/14 were projected to leave higher education without gaining a qualification.

Usefully, HESA provides breakdowns of the data by both age of students and the POLAR3 low participation indicator. This doesn’t necessarily provide any greater detail than that already held by any individual institution, but it does allow comparisons to be made with comparator institutions.

Looking at the data for Staffordshire University, we can see that:

Group | No longer in HE (%) | Benchmark (%)
Young entrants | 12.2 | 10.1
Mature entrants | 14.1 | 13.8
All entrants | 12.8 | 11.4
Young entrants from low participation neighbourhoods | 15.7 | 11.2
Young entrants from all other neighbourhoods | 11.2 | 9.7
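As a rough illustration of moving beyond a one-size-fits-all view, the benchmark gaps in the figures above can be computed and the widest flagged. The numbers are those quoted above; the 2.0-point threshold is an arbitrary choice for illustration, not anything defined by HESA:

```python
# Compute the gap between actual non-continuation rate and benchmark
# for each group in the HESA figures quoted above (actual, benchmark).

rates = {
    "young entrants": (12.2, 10.1),
    "mature entrants": (14.1, 13.8),
    "all entrants": (12.8, 11.4),
    "young, low participation neighbourhoods": (15.7, 11.2),
    "young, all other neighbourhoods": (11.2, 9.7),
}

gaps = {group: round(actual - benchmark, 1)
        for group, (actual, benchmark) in rates.items()}

# Groups more than 2.0 percentage points above benchmark might warrant
# targeted rather than blanket retention interventions (threshold is
# an illustrative choice, not a HESA definition).
flagged = [g for g, gap in sorted(gaps.items(), key=lambda kv: -kv[1])
           if gap > 2.0]
```

On these figures, young entrants from low participation neighbourhoods stand furthest above benchmark, which is exactly the kind of group a targeted intervention would aim at.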

So, no surprises there, but it does add weight to the argument that we should revise the way in which we look at the interventions necessary to support retention. If, as is evidenced here, there are groups of students who are more likely to withdraw than others, then a “one size fits all” approach to student retention will not deliver all the necessary outcomes.

In addition, HESA provide data on non-continuation rates by subject studied, entry tariff and type of qualification. The rates by entry qualification are summarised as:

Entry qualifications All subjects
   
01 A level/VCE/Advanced Higher grades AAAA or Scottish Highers grades AAAAAA 1.4%
02 A level/VCE/Advanced Higher grades at least AAA or Scottish Highers grades at least AAAAA 1.8%
03 A level/VCE/Advanced Higher grades at least AAB or Scottish Highers grades at least AAAAB or AAAAC or AAABB 2.5%
04 A level/VCE/Advanced Higher grades at least AAC 3.1%
05 A level/VCE/Advanced Higher grades at least ABB or Scottish Highers grades at least AAABC or AAACC or AABBB or AABBC 3.1%
06 A level/VCE/Advanced Higher grades at least ABC or BBB or Scottish Highers grades at least AABCC or ABBBC or ABBBCC or ABBBB or BBBBB 3.9%
07 A level/VCE/Advanced Higher grades at least ACC or BBC or Scottish Highers grades at least AACCC or ABCCC or BBBBC or BBBCC 3.9%
08 A level/VCE/Advanced Higher grades at least BCC or CCC or Scottish Highers grades at least ACCCC or BBCCC or BCCCC or CCCCC 4.2%
09 Tariff points > 290 4.8%
10 Tariff points > 260 5.3%
11 Tariff points > 230 6.6%
12 Tariff points > 200 7.4%
13 Tariff points > 160 9.2%
14 Tariff points > 100 11.3%
15 Tariff points > 0 12.9%
17 Level 3 and A level equivalent qualifications with unknown points 13.9%
19 International Baccalaureate 3.4%
20 HE level foundation course 6.1%
21 Access course 11.1%
22 BTEC 11.5%
23 Higher education qualification – Postgraduate 7.1%
24 Higher education qualification – First degree 7.6%
25 Higher education qualification – Other undergraduate 8.1%
26 No previous qualification 24.1%
27 Other qualifications not given elsewhere 17.0%
28 Unknown qualification 32.6%
   
All qualifications 6.0%

Or looking at this graphically:

Important lessons from this data? As A level tariff points decrease, the likelihood of non-continuation increases. And institutions or courses that recruit significant numbers of students with BTEC qualifications might expect higher withdrawal rates.

Putting these factors together: age, POLAR3 neighbourhood, subject and entry grades, we can use better data analytics, linked to market segmentation and enhanced personal tutoring, to identify how to provide the right support to all students, in a way that is tailored to their needs and expectations. The key part of this will not be the identification of possible at-risk students; the more difficult work will be deciding what interventions are needed to support an increasingly diverse range of students, and how to deliver them.
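As an illustration of what combining these factors might look like in practice, here is a simple rule-based sketch. To be clear, the factor names, thresholds and weighting are entirely hypothetical assumptions for illustration, not a validated model; real work in this area would need proper statistical analysis of institutional data:

```python
# Illustrative only: a hypothetical flag for students who may benefit from
# proactive personal-tutor contact, combining the factors discussed above
# (POLAR3 neighbourhood, entry qualification type, tariff points, age).
# Field names and thresholds are assumptions, not a validated model.

def needs_early_contact(student: dict) -> bool:
    """Return True if two or more risk factors are present."""
    risk_factors = 0
    if student.get("polar3_quintile") == 1:           # low participation neighbourhood
        risk_factors += 1
    if student.get("entry_qualification") == "BTEC":  # higher sector withdrawal rate
        risk_factors += 1
    if student.get("tariff_points", 0) < 200:         # lower tariff on entry
        risk_factors += 1
    if student.get("mature"):                         # mature entrant
        risk_factors += 1
    return risk_factors >= 2

# Example: a mature BTEC entrant from a low participation area is flagged
print(needs_early_contact({"polar3_quintile": 1,
                           "entry_qualification": "BTEC",
                           "mature": True}))  # True
```

Even a toy rule like this makes the point in the text concrete: flagging students is the easy part; deciding what the intervention should then be is the harder question.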

Ultimately, we want all of our students to succeed, and if we have decided that these are the people that we want to educate, then we have to provide the best opportunities for that success.


HEFCE Revised Operating Model for Quality Assessment

Last week HEFCE published their revised operating model for quality assessment. This is based on the responses to the sector consultation that took place last year, in which we, along with many other universities, identified areas of concern. Some of these have been addressed. However, this is also part of the current sectoral contest over who holds responsibility for quality; at the same time as publishing, HEFCE put out to tender various aspects of its quality work.

Key points to note from the revised operating model:

  • “future quality assessment arrangements should seek to encourage innovation in learning and teaching, rather than driving providers towards risk-averse activities and homogenised provision.”
  • “approach for implementation is therefore designed to be proportionate, risk-based and grounded in the context of each individual provider and its students”
  • a set of baseline regulatory requirements will still be based on parts of the existing quality code and the framework for higher education qualifications
  • for new entrants there will be a gateway process, followed by a developmental period of enhanced scrutiny and support
  • for established providers, a review of their own review processes, followed by a data-based Annual Provider Review and a revised periodic review visit

Some common areas of contention in responses from the sector were: comparability of standards; a potential national register of external examiners; and the role of governing bodies.

A large section of the document covers comparability of standards and the classification algorithms used. The document states that when reviewing the original proposals:

Arguments mobilised against the proposals included:
• an opposition in principle to the funding bodies acting in an area where institutional autonomy is prized
• a view that there was no particular problem to be resolved, or that the specific proposals would not resolve whatever problems might exist
• a series of more practical concerns relating to increasing the burden on external examiners, thereby disincentivising the people on whom the successful operation of the system depends.

But that “student and PSRB respondents were much clearer that modernisation in this area was important, with some suggesting that the proposed reforms did not go far enough”

HEFCE have moved away from the proposal for a national register of external examiners, and talk instead of training examiners to ensure that they are able to check comparability of standards. There is still a worry that good degree rates are rising and that these rises may not be defensible.

The role of governors was an area that many universities had plenty to say about in their responses to the consultation, where it was felt that governing bodies may not be best placed to make direct judgements about academic quality. Again, HEFCE have clarified their expectation:

The role of the governing body would be to receive reports and challenge assurances from within the institution. It should not be drawn into quality management activities itself. We recognise the predominant role of senates and academic boards (or equivalent) in academic governance, and the responsibility of the accountable officer and senior executive team, and would expect an individual governing body to be clear about the formal relationships between the elements of the governance arrangements in its own institutional context.

There’s plenty more to digest. As always, WonkHe have a guide to how the new system will work, written by Louisa Darian.

What will be interesting now is the transitional arrangements and the pilots to be run during 2016-17.


Times Higher Student Experience Survey 2016

Just before we enter league table season, the THE kicks off with their Student Experience Survey results.

This year the top university is Loughborough, followed by our geographical neighbours, Harper Adams, and then Sheffield.

Here’s what the VC of Loughborough attributes the success to:

Robert Allison, vice-chancellor of Loughborough, says that coming first in this year’s student poll was “absolutely fantastic, as it recognises all the excellent things that staff and students are doing here”.

At the heart of Loughborough’s success is the ethos that students should work with staff to create a good university experience for everyone on campus, Allison says. “When people visit us on open days, I tell them that if they’re wondering if they’ll have a TV in their room, this probably isn’t the university for them.”

At Loughborough “you can really embed yourself in the university, and if you do, you will have all sorts of chances and opportunities”, he continues.

For instance, final-year students often participate in a research project, while others take part in international secondments, such as those enjoyed by mechanical engineering students who have just returned from visiting the Massachusetts Institute of Technology.

“If you have that desire to co-create your university experience, rather than just seeing yourself as someone who shows up for 10 weeks a term, it takes you to a different level as a student,” Allison says.

As always, this is a survey based on a very small sample size compared with the NSS, but the outcomes are still interesting.

Staffordshire has risen 10 places to 78th this year. In terms of where we do well, we can look to see where our scores exceed the sector average:

  • helpful/interested staff
  • personal requirements catered for
  • good personal relationships with teaching staff
  • cheap shop/bar/amenities
  • tuition in small groups
  • fair workload

So, as we might expect, we do well in the way we work with our students, and we know that Stoke-on-Trent is a relatively cheap city in which to be a student.

Areas where we seem to be falling behind are around social life, community atmosphere and the environment on campus. Our ongoing investment in campus transformation should go a long way towards addressing this, and by September 2016, when all of our computing, music, film and games students arrive onto the redeveloped main campus, we should find ourselves working in an even more lively environment.