Different Approaches to KIS

Next week I am delivering a session to one of our faculty management teams on Key Information Sets (KIS) – specifically trying to help colleagues develop a better understanding, especially of what data can and cannot be influenced by the university.

Ultimately, all of the factors reported can be influenced – they only show what has previously been measured, whether through an institutional response specific to KIS, through HESA returns, or through the opinions and outcomes of our former students.

As Graham Gibbs has pointed out – see earlier post – the factors being measured and reported are not necessarily those that would lead to educational gains for students after interventions.

However, I was interested to see what our two main rivals (hem hem) Oxford and Cambridge do….

Cambridge, under each KIS widget, provide a list of reasons why the KIS should not be considered in isolation and may not be the most useful way of making comparisons.

Oxford, on the other hand, are more direct. They have their own graphic right above the KIS one, with a clear statement of what their students will get. Maybe we could do the same with the Staffordshire Graduate?

Oxford KIS button

UCAS Enrolment figures for 2011 and 2012 – lessons to be learnt?

The latest UCAS figures for the most recent cycle of enrolments and acceptances have been published, and commented upon in the Higher.

“Data released by the Universities and Colleges Admissions Service on 18 January reveal the full extent of shortfalls in undergraduate numbers by institution, showing that the government’s reforms have produced often wild variations in recruitment.

The figures relate to acceptances, meaning actual enrolments could be different. They also include overseas students who applied via Ucas. However, the data do not include students who applied directly (via access courses, for example).

Some in the sector suggest that declining post-1992 university figures indicate that students from disadvantaged backgrounds are most likely to decide against higher education with higher fees”

although

“The post-1992s that increased their intake appear to be largely drawn from those that won places under the margin system, which reallocates places to cheaper institutions.”

 

The table below shows the percentage change between 2011 and 2012, sorted by size of change. Staffs comes out pretty much mid-table this way, and it is interesting to see where our competitors are. (A quick sketch of how the change figures can be calculated appears after the table.)

Institution Name 2012 2011 Change in number of places Change %
American InterContinental University – London 56 6 50 833.3
Birkbeck, University of London 664 289 375 129.8
Richmond, The American International University in London 89 40 49 122.5
Regent’s College, London (incorporating Regent’s Business School, London) 78 46 32 69.6
European Business School, London 100 68 32 47.1
University of Stirling 1765 1253 512 40.9
University of Bristol 4717 3688 1029 27.9
University College London 4397 3617 780 21.6
University of Aberdeen 2900 2400 500 20.8
Glasgow School of Art 389 334 55 16.5
Ravensbourne 811 705 106 15.0
Royal Veterinary College, University of London 373 327 46 14.1
Royal Academy of Dance 58 51 7 13.7
School of Oriental and African Studies, University of London 986 870 116 13.3
Cardiff University 5799 5130 669 13.0
King’s College London, University of London 4331 3881 450 11.6
London School of Economics and Political Science, University of London 1416 1271 145 11.4
University of Edinburgh 5474 4951 523 10.6
York St John University 1638 1492 146 9.8
Queen Margaret University, Edinburgh 1107 1014 93 9.2
University of Winchester 1915 1756 159 9.1
Harper Adams University 792 731 61 8.3
BPP University College 1041 962 79 8.2
Aston University 2141 1992 149 7.5
Anglia Ruskin University 4362 4065 297 7.3
Newcastle University 4669 4357 312 7.2
Coventry University 5432 5107 325 6.4
University of Glasgow 4406 4149 257 6.2
Durham University 4026 3800 226 5.9
University of Ulster 5675 5360 315 5.9
University of Bath 2990 2832 158 5.6
Bishop Grosseteste University 612 580 32 5.5
University of Chichester 1589 1511 78 5.2
Queen’s University Belfast 3920 3736 184 4.9
Robert Gordon University 2423 2321 102 4.4
University of Chester 3064 2936 128 4.4
University of Cambridge 3401 3261 140 4.3
Norwich University Of The Arts 635 609 26 4.3
Central School of Speech and Drama, University of London 233 224 9 4.0
Brighton and Sussex Medical School 141 136 5 3.7
Hull York Medical School 147 142 5 3.5
Goldsmiths, University of London 1800 1743 57 3.3
University of Exeter 4356 4220 136 3.2
University of Strathclyde 2899 2822 77 2.7
Falmouth University 1443 1410 33 2.3
Buckinghamshire New University 2379 2327 52 2.2
University of Huddersfield 4962 4866 96 2.0
Birmingham City University 5308 5214 94 1.8
University of Brighton 5417 5326 91 1.7
St George’s, University of London (formerly St George’s Hospital Medical School) 657 646 11 1.7
Glasgow Caledonian University 3530 3473 57 1.6
Bangor University 2410 2372 38 1.6
University of Oxford 3281 3237 44 1.4
University of York 3749 3701 48 1.3
University of East Anglia 3540 3505 35 1.0
St Mary’s University College, Twickenham 1245 1237 8 0.6
University of Sussex 3221 3203 18 0.6
University of the Highlands and Islands 2161 2153 8 0.4
University of Leicester 3114 3113 1 0.0
University of Reading 2945 2948 -3 -0.1
University of Portsmouth 5289 5305 -16 -0.3
University of Nottingham 7160 7187 -27 -0.4
University of Warwick 3828 3846 -18 -0.5
UCP Marjon 800 806 -6 -0.7
Swansea Metropolitan University 1425 1446 -21 -1.5
Rose Bruford College 213 217 -4 -1.8
University College Birmingham 1312 1337 -25 -1.9
University of Northampton 3017 3079 -62 -2.0
Cardiff Metropolitan University 2646 2704 -58 -2.1
Loughborough University 3359 3439 -80 -2.3
University of St Andrews 1696 1741 -45 -2.6
Staffordshire University 3790 3895 -105 -2.7
Southampton Solent University 3810 3920 -110 -2.8
Heriot-Watt University, Edinburgh 1814 1872 -58 -3.1
University of Manchester 7861 8114 -253 -3.1
Royal Holloway, University of London 2375 2452 -77 -3.1
Oxford Brookes University 3810 3934 -124 -3.2
Lancaster University 2778 2882 -104 -3.6
Courtauld Institute of Art, University of London 51 53 -2 -3.8
Leeds Trinity University 870 905 -35 -3.9
City University 2880 2997 -117 -3.9
London South Bank University 3740 3893 -153 -3.9
University of Abertay Dundee 1242 1301 -59 -4.5
Bath Spa University 1919 2021 -102 -5.0
University of Wolverhampton 4587 4855 -268 -5.5
Arts University College at Bournemouth 953 1011 -58 -5.7
Queen Mary, University of London 3484 3704 -220 -5.9
Keele University 2023 2153 -130 -6.0
University of Leeds 6428 6844 -416 -6.1
Imperial College London 2226 2377 -151 -6.4
University of the West of Scotland 4228 4516 -288 -6.4
Canterbury Christ Church University 3675 3927 -252 -6.4
University of Kent 4942 5281 -339 -6.4
University of Birmingham 5135 5520 -385 -7.0
Nottingham Trent University 6356 6857 -501 -7.3
ifs School of Finance 36 39 -3 -7.7
University of the Arts London 4305 4665 -360 -7.7
University of Gloucestershire 2215 2403 -188 -7.8
Stranmillis University College: A College of Queen’s University Belfast 244 265 -21 -7.9
University of Worcester 2681 2919 -238 -8.2
University of Essex 2907 3166 -259 -8.2
Roehampton University 2254 2455 -201 -8.2
Liverpool Institute for Performing Arts 219 240 -21 -8.8
Kingston University 6210 6809 -599 -8.8
Edge Hill University 3550 3900 -350 -9.0
Northumbria University 5714 6290 -576 -9.2
University of Sheffield 4711 5197 -486 -9.4
Sheffield Hallam University 7425 8211 -786 -9.6
University of the West of England, Bristol 6584 7284 -700 -9.6
Edinburgh Napier University 3480 3854 -374 -9.7
University of Liverpool 3945 4369 -424 -9.7
Bournemouth University 3920 4342 -422 -9.7
Middlesex University 4139 4619 -480 -10.4
University of Hull 4356 4880 -524 -10.7
De Montfort University 4638 5230 -592 -11.3
University of Bedfordshire 3815 4303 -488 -11.3
University of Westminster 4503 5088 -585 -11.5
University of Sunderland 2503 2831 -328 -11.6
Swansea University 2939 3326 -387 -11.6
British School of Osteopathy 69 79 -10 -12.7
Plymouth University 5173 5923 -750 -12.7
University of West London 2394 2742 -348 -12.7
Liverpool John Moores University 5473 6284 -811 -12.9
University of Southampton 4499 5189 -690 -13.3
Brunel University 2821 3279 -458 -14.0
Royal Agricultural College 313 364 -51 -14.0
University of Derby 3002 3555 -553 -15.6
University of Lincoln 3135 3715 -580 -15.6
University of Hertfordshire 4730 5618 -888 -15.8
Manchester Metropolitan University 7642 9083 -1441 -15.9
Liverpool Hope University 1476 1760 -284 -16.1
University of Central Lancashire 5318 6355 -1037 -16.3
University for the Creative Arts 1727 2064 -337 -16.3
University of Surrey 2104 2515 -411 -16.3
University of Salford 3953 4808 -855 -17.8
University of London Institute in Paris 49 60 -11 -18.3
University of Cumbria 1950 2391 -441 -18.4
University of Bradford 2748 3377 -629 -18.6
University of Dundee 2141 2637 -496 -18.8
Newman University College, Birmingham 535 659 -124 -18.8
Aberystwyth University 2655 3283 -628 -19.1
University of Wales, Newport 1147 1426 -279 -19.6
University of East London 4385 5510 -1125 -20.4
University of Glamorgan, Cardiff and Pontypridd 3265 4105 -840 -20.5
Glyndwr University 731 926 -195 -21.1
Scottish Agricultural College 275 350 -75 -21.4
Heythrop College, University of London 149 192 -43 -22.4
Leeds Metropolitan University 6265 8084 -1819 -22.5
University of Greenwich 4034 5223 -1189 -22.8
University Campus Suffolk 1398 1811 -413 -22.8
Peninsula College of Medicine & Dentistry 239 319 -80 -25.1
University of Bolton 1259 1686 -427 -25.3
University of Wales Trinity Saint David 821 1156 -335 -29.0
University of Buckingham 157 244 -87 -35.7
Writtle College 293 458 -165 -36.0
London Metropolitan University 4079 7209 -3130 -43.4
Royal Welsh College of Music and Drama (Coleg Brenhinol Cerdd a Drama Cymru) 31 60 -29 -48.3
Institute of Education, University of London 28
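
For anyone wanting to reproduce this kind of table from the raw UCAS acceptance figures, the calculation is straightforward. Below is a minimal sketch in Python – the file and column names are my own illustration, not the format of the actual UCAS download.

```python
# Minimal sketch: compute and sort the year-on-year change in UCAS acceptances.
# Assumes a CSV with columns "Institution", "2011" and "2012" – these names are
# illustrative, not the format of the actual UCAS download.
import csv

rows = []
with open("ucas_acceptances.csv", newline="") as f:
    for record in csv.DictReader(f):
        y2011 = int(record["2011"]) if record["2011"] else None
        y2012 = int(record["2012"])
        change = y2012 - y2011 if y2011 is not None else None
        # Percentage change relative to the 2011 intake, to one decimal place
        pct = round(100.0 * change / y2011, 1) if y2011 else None
        rows.append((record["Institution"], y2012, y2011, change, pct))

# Sort by percentage change, largest increase first; rows with no 2011 figure
# sink to the bottom rather than raising an error.
rows.sort(key=lambda r: r[4] if r[4] is not None else float("-inf"), reverse=True)

for name, y2012, y2011, change, pct in rows:
    print(f"{name}\t{y2012}\t{y2011}\t{change}\t{pct}")
```

Nothing clever – but having it scripted makes it easy to re-run against next year's cycle.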

Implications of ‘Dimensions of quality’ in a market environment

Graham Gibbs has published, through the HEA, a further paper on Dimensions of Quality, this time considering the market environment in which we supposedly operate and some of its implications. These are reported below, together with some of my comments. The original paper can be accessed here.

1 This report concerns the practical implications of the use of performance indicators for the way institutions are currently attempting to attract students, improve quality, improve ‘value for money’, and improve their relative standing in relation to educational provision. Institutions are responding to this data-driven market in a variety of ways, some of them perhaps unexpected and some with probably negative consequences. The report suggests ways in which the use of data in a market could be tuned up to have more positive effects.


2 The conclusions of the report are based on:

• examination of the data currently available to students and used by institutions, and their validity and usefulness;
• literature about performance indicators in higher education, and also literature about the effect that performance indicators and markets have on the behaviour of organisations in any public sector, such as schools and hospitals;
• meetings with those senior managers responsible for educational quality within institutions, both in national gatherings and through interviews within 12 institutions of a wide variety of types;
• examination of institutional documentation, for example about how quality data are reported and used internally, and institutional responses to the Browne Report.


3 It is not yet clear whether institutional attempts to improve National Student Survey (NSS) scores and other quality indicators are having any effect on student recruitment, let alone on learning gains. To a large extent the market is perceived to be driven by reputation, just as in the past. US research shows that reputation tells you almost nothing about educational quality, use of effective educational practices, or learning gains, but merely reflects research performance, resources and fee levels. It is uncertain whether the use of more valid indicators of educational quality will gradually change perceptions of what reputation is about, and turn it into a more useful guide to student choice.

 

4 Data currently provided to potential students, such as Key Information Sets (KIS), and used by institutions to make decisions, include some valid indicators of educational quality and also include variables that are invalid or difficult to interpret. There is scope to improve the value of the information provided to students, and used by institutions, by changing some of the variables and collecting and collating somewhat different data. In particular it is not yet possible for students to see what educational provision their fees will purchase (such as class size, which predicts learning gains) other than the proportion of class contact hours (which does not predict learning gains).

Is this an area of opportunity for individual institutions to make the most of the information they hold about class sizes, and about who is doing the actual teaching?
5 The aspects of educational provision that institutions pay attention to in their internal quality assurance processes often overlook crucial indicators. Any new quality regime should ensure that it focuses on the right variables, and the use of valid quality indicators in KIS and elsewhere would help to lever appropriate attention.


6 Regardless of the validity of currently available data, institutional behaviour is being driven by data to an unprecedented extent. In most institutions there is now an annual cycle of analysis of performance indicators at both institutional and departmental level, followed by planning to improve them, again at both institutional and departmental level. Departments are much more aware of how their competitors at other institutions perform, in relation to the main indicators. In some cases this annual analysis of data has in effect taken over from periodic review and QAA audit as the main driver of quality assurance and enhancement (and without this having been planned or agreed). Any future revision of national quality assurance mechanisms, and requirements on institutions, will need to take this reality into account.

We have the opportunity to embed our own portfolio performance review tool into our annual processes, while at the same time reviewing how we carry out internal annual monitoring and how we respond to student surveys. We might be able to do more if we join all of these activities together, rather than seeing them as separate and distinct.
7 Most currently available data are about degree programmes, and students apply to study degree programmes. In contrast much quality assurance, and course design and documentation, has focused on individual modules. In modular course structures the collection of modules that students experience may relate loosely to the unit of analysis of the NSS. This confronts modular institutions and modular degree programmes with major problems in interpreting and acting on the degree-programme-level data from the NSS. A consequence is that some institutions are greatly reducing the number of combined Honours degrees offered and moving away from modularity back to traditional single subject degree programmes with greater alignment of student experience with the unit of analysis, and labelling, of public indicators of quality. There are consequences of this shift for the diversity of curricula and for student choice, which may have negative impacts.

This is an interesting assertion. Many institutions have moved away from joint programmes for other reasons: a confused market offer, inefficiencies in delivery, and a poor student experience where there can be a lack of belonging. However, it is true that as we reduce the portfolio, we do potentially reduce the diversity of curricula within the remaining awards. The difficulty of associating awards with published NSS results is well known – more work can be done here to make sure we understand which awards are categorised where (and why!), so that sensible interpretation of published data can be undertaken, and the right changes made.

 

8 There has been a considerable emphasis over the past decade on training and accrediting individual teachers, rewarding individual teachers, and on funding local innovation in teaching. There is a marked lack of corresponding institutional emphasis on the effective operation of ‘programme teams’ (all those who contribute to the teaching of a degree programme), on developing leadership of teaching, and on curriculum design and assessment at programme level. A change of focus of national and institutional enhancement efforts is overdue. Institutional career structures still need to be developed that reward leadership of teaching, rather than only individual research and individual teaching. Funding for innovation, both within institutions and by national bodies, should be targeted on programmes rather than on modules and on the involvement of entire programme teams rather than on individuals.

Do we know enough about teams? For instance, all new staff are required to complete a PGCHPE, but what are we doing about experienced staff, and about the non-teaching staff who are associated with the programme?
9 Many institutions are using data to identify a previously overlooked quality problem and address it: the most common example is poor and slow feedback to students on their assignments. Institutions are then making very broad scale changes that affect all degree programmes and all teachers in order to address these problems. Data are successfully driving change and in some cases there is clear evidence of improvements in NSS scores as a consequence of the institution-wide change. Some centrally determined changes will limit teachers’ scope to enhance teaching in contextually sensitive ways, and will make things worse.

I think we all recognise that a “one size fits all” approach does not always work. However, when we do identify changes to be implemented across the institution, we have to identify when exceptions are valid, and equally when they are not!


10 An increasing number of institutions are using data to track progress in emphasising the ‘institutional USP’. They are marketing themselves as distinctive in relation to a particular indicator, such as employability, and emphasising that variable in programme-level learning outcomes and in institution-wide quality enhancement efforts, and then collecting better data than are currently available in order to monitor progress.


11 In light of the prominence given to overall student satisfaction data in KIS and league tables, it is not surprising that institutions are addressing ‘satisfaction’ issues with vigour. This may be less to do with teaching than with consistently high standards of service delivery. In some cases these two domains of quality overlap, as with policies and practices concerning assignment turnaround times. Many institutions have a range of initiatives designed to improve service delivery, using NSS data to target efforts.

Yep – we’ve all got those! But the really interesting thing about consistent turnaround times for feeding back on assignments is this: even when we know we meet our targets, and even when we tell our students what we are going to do, we still receive poor results for feedback! There is still a perception gap between what we mean by feedback, what we consider to be timely, and what our students think.

 

12 While there is a sense in which students are being treated as consumers of a product, institutions with good and improving NSS scores often have initiatives that engage students as co-producers of knowledge, or partners in an educational enterprise. Attempts to improve student engagement are taking many forms and sometimes involve students having responsibility for administering and interpreting student feedback questionnaires, and acting as change agents, and also central support for activities run by the students’ union that relate to educational provision. It is unclear the extent to which NSS scores for a programme reflect extra-curricular initiatives of this kind, but some institutions are behaving as if they are important.


13 One focus of attention of the interviews undertaken for this report was whether institutions are focusing on ‘value for money’ by paying renewed attention to using cost-effective teaching methods in order to deliver a good quality of education given the level of fees and other income. There seems to be plenty of evidence of a squeeze on resources, and adoption of practices that save money, but not of an equivalent focus on using more effective methods. There is a need for a national initiative on cost-effective teaching so that, where reduced resources force changes to teaching practices, it might be possible to maintain or even to improve student learning.


14 Some of the institutions that are charging the lowest fees are suffering from competing demands to maintain or enhance their research efforts in order to retain research degree awarding powers. Attempts to improve teaching quality in such contexts face challenging conflicts of interest.

“What Works” – HEA/Paul Hamlyn Foundation project on student retention and attainment

Staffordshire University is a partner in this project, and three of our discipline areas will be working specifically on clearly defined projects over the next three years:
• Business Management
• Music Technology
• Engineering

We recognise the importance of the development of “belonging” in the first year of study. We believe that the inculcation of a sense of cohort belonging (through involvement, engagement and connectedness with the university experience, teachers and peers) is key to adapting to change.
Our focus will include the classroom environment and the core practices of education as key influencers on student experience and success, reflecting the salience of the related notions of a learning community and transition pedagogy.
We will therefore focus on the transition to higher education and how students learn to engage with the academic sphere, maximising the opportunities for students to develop a sense of belonging and identity by improving the range of support mechanisms that we offer. In parallel to this, we will develop and make readily available more robust data on student engagement and attainment.

The proposed work streams across three selected programmes are as follows:

• Revise personal tutoring to improve relationships between students and staff and to support the development of graduate attributes (one award initially)
• Develop the knowledge culture and identity of students through engagement with the Staffordshire Graduate attributes programme (one award initially)
• Develop mentoring of level 4 students by level 5 students (one award initially)

For all programmes we will:

• Review and improve pre-enrolment communication and activities
• Review and improve welcome semester activities
• Improve data systems to support better tracking of student engagement and success
• Pilot the use of a student engagement survey based on the Australian AUSSE

Outcomes realised through participation:

• Students will develop a better sense of belonging to HE and to the institution, with stronger relationships with academic staff and especially with their personal tutors, which will encourage greater engagement with learning and teaching activities.

• Academic staff will be more engaged with their personal tutees, and be able to support the further development of the academic community.

• Improved data reports will be developed to support academic and faculty management staff to understand and recognise overall trends in withdrawal, retention and success.

• Improvements in retention and success at award level will be measured through our portfolio performance review system.

Expected quantitative measures of outcomes:

• Reduced withdrawal rates, especially at level 4
• Improved progression rates through all levels
• Improved percentage of students gaining 1sts and 2(i)s
• Improved results in internal/national student surveys
• Evidence base of student engagement

Degree Classifications 2011-12

“Statistical First Release 183 – Student Enrolments and Qualifications”, published by HESA on 10-1-13, provides early information on student enrolments and, in particular, on student attainment last year.

(from http://www.hesa.ac.uk/content/view/2667/393/)

Of those gaining a classified first degree, the proportion who obtained a first or upper second has shown a steady increase from 61% in 2007/08 to 66% in 2011/12.

67% of first degrees undertaken through full-time study in 2011/12 achieved first or upper second classifications compared to 53% of those undertaken through part-time study.

Staffordshire University managed to award “good degrees” to about 55% of its graduates in 2011-12, an increase on the previous year, but still significantly behind the national average and our competitors.

Clearly, this does have an impact on league tables, where it is used, for example, in the Guardian’s “value added” calculation. Ultimately it could have an impact on recruitment – where would you choose to go: the university with the highest or the lowest probability of getting a 1st or 2(i)?

Time to start blogging again….

Time to start blogging again – I read so much stuff, internal and external, that I need a place to keep my thoughts and the comments of colleagues – for the moment, this seems like the best place to do that.

My main work this year will be about:

  • improving student experience
  • improving undergraduate retention rates
  • understanding league tables, and the factors that go to make them up
  • developing models of the performance of the academic portfolio

So – any posts here are likely to be about those 4 main areas, plus any little gems I pick up published elsewhere.

MOOCs – the battle lines are being drawn

MOOCs are hitting the HE news, and flooding my Twitter stream again.


Inevitably there are two camps – the pros and the antis.


A quick summary – US universities working with Udacity, Coursera and others have already entered the MOOC arena. Until recently, statements were being made about major investors not knowing who to approach in the UK to do the same. It’s still expected to be big, though.


This has now been addressed with the formation of FutureLearn, through which elite institutions will team up with the Open University to offer free internet courses to rival the US platforms Coursera and edX.


Now the arguments can start – what is the benefit of engaging with MOOCs?

What is the likely income stream or financial model for universities?

Is this development going to bring university education as we know it to an end?


On one side we have Clay Shirky, who prophesies that this is the Napster moment for universities.


Rebutting him are the likes of David Kerohan and Patrick McGhee.

BME Attainment Gap

An article in the Higher, entitled “Mind, don’t dismiss, the BME attainment gap”, refers to the difference in attainment between different groups of students.


Some really frightening stats here:


 “Figures released this week by the Equality Challenge Unit show that 69.5 per cent of UK-domiciled white students achieved a first or a 2:1 in 2010-11, compared with 51.1 per cent of BME students.

The gap was even wider for black students, with only 40.3 per cent scoring a first or a 2:1, according to Equality in Higher Education: Statistical Report 2012, published on 20 November.”

This is reflected across many UK HEIs, and I know that the stats for Staffordshire University show similarly worrying trends.

I think it’s time to identify what we can do to tackle this.