The Gender Gap at Universities

More from the Guardian Datablog

This time the table below shows student numbers for 2010-11, by gender, sorted so that the institutions with the highest percentage of male students come first. (For the data-minded, a short sketch of how the ordering can be reproduced follows the table.)

Nationally 55% of undergraduates are female, and the trend is for the number of female students to continue to increase.

Number and proportion of students at higher education institutions, by gender, 2010-11
Rank Institution % female full-time undergraduates % male full-time undergraduates
1 Leeds College of Music 21.4 77.9
2 Imperial College of Science, Technology and Medicine 34.0 66.1
3 Loughborough University 38.0 62.0
4 Swansea Metropolitan University 41.2 59.0
5 Heriot-Watt University 41.3 58.7
6 Ravensbourne 42.9 57.1
7 Southampton Solent University 43.9 56.1
8 The University of Bath 43.9 56.1
9 The University of Buckingham 44.4 55.6
10 Staffordshire University 44.5 55.4
11 Scottish Agricultural College 44.2 55.2
12 The University of Portsmouth 45.2 54.8
13 Birkbeck College(#9) 45.2 54.8
14 Heythrop College(#9) 45.7 54.3
15 The University of Oxford 45.8 54.2
16 Brunel University 46.1 53.9
17 Royal Northern College of Music 47.5 53.5
18 The Liverpool Institute for Performing Arts 46.7 53.3
19 The University of Cambridge(#12) 46.9 53.1
20 Royal Agricultural College 47.7 52.9
22 Aston University 47.2 52.8
23 The University of Bolton 47.3 52.7
24 The University of Warwick 47.6 52.4
25 Aberystwyth University 48.0 52.0
26 University of Abertay Dundee 48.3 51.7
27 Coventry University 48.4 51.6
28 Royal Academy of Music(#9) 49.3 50.7
29 Royal College of Music 49.3 50.7
30 The University of Newcastle-upon-Tyne 49.3 50.7
32 Harper Adams University College 49.1 50.6
33 The University of Sheffield 49.8 50.2
34 University of Glamorgan 50.0 50.0
35 Guildhall School of Music and Drama 50.0 50.0
36 The Open University 50.0 50.0
37 London School of Economics and Political Science(#9) 50.3 49.7
38 University College London(#9) 50.7 49.3
39 The University of Strathclyde 50.9 49.1
40 Swansea University 51.0 49.0
41 Leeds Metropolitan University 51.3 48.7
42 The University of Leicester 51.5 48.5
43 University of Durham 51.5 48.5
44 Kingston University 51.9 48.1
45 Queen Mary and Westfield College(#9) 51.8 48.1
46 The University of Hull 51.9 48.1
47 The University of Bristol 52.1 47.9
48 The University of Bradford 52.3 47.6
49 Rose Bruford College 52.5 47.5
50 Liverpool John Moores University 52.7 47.3
51 The University of Lancaster 52.7 47.3
52 The University of Huddersfield 52.8 47.2
53 The University of Plymouth 53.1 46.9
54 The University of Liverpool 53.1 46.9
55 University of Hertfordshire 53.2 46.8
56 The University of Salford 53.2 46.8
57 The University of Exeter 53.2 46.8
58 The University of Manchester 53.3 46.7
59 University of Wales Trinity Saint David(#7)(#8) 53.3 46.7
60 The University of Southampton 53.4 46.6
61 Bournemouth University 53.4 46.6
62 The University of Kent 53.4 46.6
63 The University of Aberdeen 53.5 46.5
64 The University of Essex 53.5 46.5
65 The University of York 53.5 46.5
66 The University of Reading 53.6 46.4
67 The University of East London 53.6 46.4
68 Teesside University(#8) 53.8 46.2
69 University of Derby 54.0 46.0
72 The Nottingham Trent University 54.0 46.0
73 University of the West of England, Bristol 54.1 45.9
74 Glyndŵr University 54.0 45.8
76 The University of Sunderland 54.4 45.6
77 University of the Highlands and Islands(#8) 54.4 45.6
78 The University of Greenwich 54.8 45.2
79 Sheffield Hallam University 54.8 45.2
80 The University of Wales, Newport 55.0 45.0
81 The University of Northumbria at Newcastle 55.1 44.9
82 Total UK 50.9 49.1
83 The University of Sussex 55.1 44.9
84 Royal Conservatoire of Scotland(#8) 55.2 44.8
85 Oxford Brookes University 55.3 44.7
86 The University of Birmingham 55.3 44.7
87 The University of Nottingham 55.4 44.6
88 Edinburgh Napier University 55.5 44.5
89 The University of Edinburgh 55.7 44.3
90 The University of Keele 55.9 44.1
91 Cardiff Metropolitan University(#8) 56.0 44.1
92 The Manchester Metropolitan University 56.1 43.9
93 The University of Westminster 56.2 43.8
94 The University of Glasgow 56.2 43.8
95 De Montfort University 56.3 43.7
96 The University of Central Lancashire 56.4 43.6
97 The University of Brighton 56.4 43.6
98 The University of East Anglia 56.4 43.6
99 Bangor University 56.5 43.5
101 The Queen’s University of Belfast 56.5 43.5
102 The City University 56.9 43.1
103 Buckinghamshire New University 57.0 43.0
104 London Metropolitan University 57.0 43.0
105 University of Ulster 57.1 42.9
106 London South Bank University 57.2 42.8
107 Cardiff University 57.6 42.4
108 University College Falmouth 57.6 42.4
109 The University of Wolverhampton 57.8 42.2
110 Trinity Laban Conservatoire of Music and Dance 57.8 42.2
111 The University of Surrey 57.8 42.2
112 The University of St Andrews 57.9 42.1
113 University of Gloucestershire 58.0 42.0
114 The University of Lincoln 58.3 41.7
115 The University of Leeds 58.3 41.7
116 St George’s Hospital Medical School(#9) 58.6 41.4
117 Royal Holloway and Bedford New College(#9) 58.8 41.2
118 The University of West London(#8) 59.2 40.8
119 Conservatoire for Dance and Drama 59.2 40.8
120 The University of Chichester 59.5 40.5
121 Norwich University College of the Arts 59.5 40.5
122 St Mary’s University College, Twickenham 59.8 40.2
123 University of Bedfordshire 60.2 39.8
124 Middlesex University 60.2 39.8
125 The University of the West of Scotland 60.8 39.2
126 The School of Oriental and African Studies(#9) 60.8 39.2
127 King’s College London(#9) 61.0 39.0
128 Central School of Speech and Drama(#9) 61.2 38.8
129 Anglia Ruskin University 61.9 38.1
130 The University of Worcester 62.2 37.8
131 Glasgow Caledonian University 62.3 37.7
132 University College Plymouth St Mark and St John 62.2 37.6
133 The University of Dundee 62.6 37.4
134 Birmingham City University 62.8 37.2
135 The University of Stirling 62.9 37.1
136 The Arts University College at Bournemouth 63.0 37.0
137 University College Birmingham 63.9 36.1
138 The Robert Gordon University 64.0 36.0
139 Edinburgh College of Art 64.6 35.8
140 Goldsmiths College(#9) 64.3 35.7
141 Canterbury Christ Church University 64.4 35.6
142 Glasgow School of Art 64.5 35.5
143 Writtle College 64.2 35.2
144 The University of Northampton 64.9 35.1
145 University for the Creative Arts 65.5 34.6
146 Leeds Trinity University College 65.5 34.5
147 The School of Pharmacy(#9) 67.1 32.9
148 York St John University 67.2 32.8
149 Edge Hill University 67.4 32.6
150 Bath Spa University 67.5 32.5
151 The University of Winchester 67.8 32.2
152 University of Chester 67.9 32.1
153 University of Cumbria 69.2 30.8
154 St Mary’s University College 70.9 29.1
155 University Campus Suffolk 71.5 28.5
156 University of the Arts, London 72.5 27.5
157 Liverpool Hope University 73.3 26.8
158 Roehampton University 74.6 25.4
159 Newman University College 74.6 25.2
160 Stranmillis University College 77.4 22.6
163 The Royal Veterinary College(#9) 79.2 20.8
164 Queen Margaret University, Edinburgh 79.5 20.5
165 Courtauld Institute of Art(#9) 83.3 20.0
166 Bishop Grosseteste University College Lincoln 80.2 19.8
Institute of Education(#9) 85.7 10.7
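
For anyone who wants to reproduce the ordering above, here is a minimal sketch in Python/pandas. The file name and column names are my own assumptions, not part of the HESA release.

```python
# Minimal sketch: rebuild the ranking above from raw headcounts.
# Assumes a hypothetical CSV "hesa_gender_2010-11.csv" with columns:
# institution, female_ft_ug, male_ft_ug (full-time undergraduate headcounts).
import pandas as pd

df = pd.read_csv("hesa_gender_2010-11.csv")
total = df["female_ft_ug"] + df["male_ft_ug"]
df["pct_female"] = (100 * df["female_ft_ug"] / total).round(1)
df["pct_male"] = (100 * df["male_ft_ug"] / total).round(1)

# Sort so the most male-dominated institutions come first, as in the table.
df = df.sort_values("pct_male", ascending=False)
print(df[["institution", "pct_female", "pct_male"]].head(10).to_string(index=False))
```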


The Gender Gap at Universities – what students choose to study

Published on the Guardian datablog site today: some data about the gender of students, the subjects that they study, and their attainment.

The table below shows the changes in student numbers over the last five years, by gender and in total. (A short sketch of the percentage-change arithmetic follows the table.)

Interesting points? The only subject areas to show overall decline are computer science and combined awards. The latter can be explained by the move of so many institutions towards a tighter, more managed portfolio of awards. The former is a worry – we know that there is a shortage of well qualified computer scientists, and we have the very real problem of working with schools to get young people to recognise what the subject is actually about (not ICT!).

Higher education qualifications obtained by students (postgrad and undergrad) in the UK by gender and subject area, 2011/12
Subject area Number of female students Number of male students Total students % female % male % change in female students, 2006-07 to 2011-12 % change in male students, 2006-07 to 2011-12 % change in all students, 2006-07 to 2011-12
Medicine & dentistry 10650 7555 18200 58.5 41.5 31.7 28.6 30.4
Subjects allied to medicine 68470 17280 85750 79.8 20.2 -2.4 21.7 1.6
Biological sciences 34450 20970 55420 62.2 37.8 23.5 38.8 28.9
Veterinary science 875 255 1135 77.1 22.5 52.2 -3.8 35.1
Agriculture & related subjects 3635 2255 5885 61.8 38.3 24.7 22.9 23.9
Physical sciences 11110 15100 26210 42.4 57.6 18.6 27.5 23.6
Mathematical sciences 4720 6765 11485 41.1 58.9 38.6 25.9 30.8
Computer science 5750 24765 30520 18.8 81.1 -11.1 -0.2 -2.4
Engineering & technology 8595 42085 50680 17.0 83.0 36.0 30.3 31.2
Architecture, building & planning 7340 14405 21745 33.8 66.2 30.8 27.1 28.4
Total – Science subject areas 155585 151440 307025 50.7 49.3 10.5 23.2 16.4
Social studies 46255 27485 73740 62.7 37.3 22.7 21.1 22.1
Law 19585 13480 33065 59.2 40.8 8.0 10.5 9.0
Business & administrative studies 69655 70370 140020 49.7 50.3 41.8 45.0 43.3
Mass communications & documentation 11815 8090 19905 59.4 40.6 27.9 33.4 30.1
Languages 25345 11495 36845 68.8 31.2 14.8 22.7 17.2
Historical & philosophical studies 15000 13170 28170 53.2 46.8 8.3 15.5 11.6
Creative arts & design 37755 23535 61285 61.6 38.4 32.2 30.5 31.5
Education 61430 18915 80340 76.5 23.5 13.7 3.8 11.2
Combined 4100 2705 6810 60.2 39.7 -14.0 -17.4 -15.4
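
As promised, a quick sketch of the percentage-change arithmetic behind the last three columns. The 2011-12 counts are taken from the medicine & dentistry row above; the 2006-07 baseline counts are invented placeholders, since the table only shows the resulting percentages.

```python
# Sketch of the percentage-change arithmetic used in the table above.
def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new, e.g. 100 -> 120 gives 20.0."""
    return 100 * (new - old) / old

# 2011-12 counts from the medicine & dentistry row; the 2006-07
# baselines are invented placeholders, purely to show the calculation.
female_2006, female_2011 = 8000, 10650
male_2006, male_2011 = 6000, 7555

print(round(pct_change(female_2006, female_2011), 1))   # % change, female students
print(round(pct_change(male_2006, male_2011), 1))       # % change, male students
print(round(pct_change(female_2006 + male_2006,
                       female_2011 + male_2011), 1))    # % change, all students
```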

This week’s news on MOOCs

And still MOOCs are the dish of the day, in meetings around HEIs, and in the pages of the Higher.

I have to confess to having decided which side of the fence I sit on (although I might be persuaded to change) – MOOCs may look great, but they are not for every university to pursue, and they won’t sound the death knell for every university, despite Clay Shirky saying this is the Napster moment for higher education. Napster changed the music industry, but it didn’t kill it – people still go out to buy the music experiences they need. And these aren’t just digital downloads – a live gig is still pretty important!

Anyway, onto this week’s coverage.

Firstly, the VC of Cambridge warns of the massive threat posed by MOOCs, saying that “less prestigious universities that focused on teaching rather than research could struggle in the face of new online courses. For those in the knowledge-transfer system, there are troubled times ahead.” However, “online courses did not pose a threat to Cambridge because they could not replicate the debate and discussion central to the university’s tutorial system”. So that’s alright then. Of course, MOOCs also won’t be able to replace what we do in our studios, in our labs, our workshops, our seminar sessions, and yes – in our personal tutorials. If we are smart, what we will do is exploit the existing digital resources out there, and base our proposition on the support we give to students, and the importance of the social aspects of learning.

In the same week in the Higher, “Online study certificates go on sale, but Coursera’s Andrew Ng tells Chris Parr they won’t match traditional degrees”. Coursera has started to charge for accreditation of completion of some of its online courses, but one of the founders of the company recognises that this will never be as valuable as the currency of a traditional degree from a prestigious university. He does point out the benefit to existing graduates of using MOOCs to top up and refresh knowledge, and to receive certification that they have done so.

Leadership in Higher Education – a new publication

An old friend and onetime colleague of mine who publishes a popular blog has written about a recent Leadership Foundation publication, “What do we know about leadership in higher education?”.

Dr Greatrix writes that “We seem to be clutching at straws in trying to establish whether there is any evidence for leadership benefiting universities in terms of their core activities:

Evidence of the impact of leadership on the extent and quality of research, learning and enterprise is rather slim.

Moreover, university staff inevitably have contrasting views on what effectiveness means, what its characteristics are and indeed whether individuals can even be described in this way:

What works in one context will not necessarily work in another, and equally may be judged as effective and ineffective in the same context. As in the wider literature, the research generates lists of characteristics of effective leaders that are somewhat idealised and apolitical. Oppositional narratives underpin estimates of effectiveness; a rational narrative stresses data-driven, command and control, while an alternative prizes an open-ended and fluid creation of space in which autonomy can flourish. Effectiveness is currently related to individuals, but might be more usefully applied to units.”

This might all be a little depressing, particularly for universities that have invested significant amounts of time and money in leadership development for their senior staff. Personally, I find the short mantra of Rob Goffee and Gareth Jones (authors of “Why Should Anyone Be Led by You?”) a useful way of viewing leadership – Be Yourself, More, With Skill.


More on BME Attainment

One of the reasons I look at this is that I am tasked with trying to increase the number of "good" degrees, ie 1sts and 2(i)s, that we award.

It is obvious that if certain groups of students are less successful than others, then we need to understand why, and in so doing make sure that all of our students have the same opportunities to succeed.

The Higher (24th January 2013), in an article "Black students reluctant to seek aid", suggests that a reason for lower attainment might be a reluctance to seek help from lecturers. The suggestion is that universities need to be more proactive in ensuring that black students access the academic support on offer.

In an era of increasing class sizes, this will be a challenge – if we can develop personal tutoring systems or encourage enough small group teaching with formative feedback opportunities, we may be better placed to identify when any of our students need extra support.

More details of the research carried out by Jacqueline Stevenson of Leeds Met can be downloaded from here.

Different Approaches to KIS

Next week I am delivering a session to one of our faculty management teams on Key Information Sets (KIS) – specifically trying to help colleagues generate a better understanding, especially of what data can and can’t be influenced by the university.

Ultimately, all of the factors reported can be influenced – they only show what has been previously measured, either by an institutional response specific to KIS, through HESA returns, or through the opinions and outcomes of our former students.

As Graham Gibbs has pointed out – see earlier post – the factors being measured and reported do not necessarily represent those that would lead to educational gains for students after interventions.

However, I was interested to see what our two main rivals (hem hem) Oxford and Cambridge do….

Cambridge, under each KIS widget, provide a list of reasons why KIS should not be considered in isolation, and may not be the most useful way of making comparisons.

Oxford, on the other hand, are more direct. They have their own graphic right above the KIS one, with a clear statement of what their students will get. Maybe we could do the same with the Staffordshire Graduate?

Oxford KIS button

UCAS Enrolment figures for 2011 and 2012 – lessons to be learnt?

The latest UCAS figures for the last cycles of enrolments and acceptances have been published, and commented upon in the Higher.

“Data released by the Universities and Colleges Admissions Service on 18 January reveal the full extent of shortfalls in undergraduate numbers by institution, showing that the government’s reforms have produced often wild variations in recruitment.

The figures relate to acceptances, meaning actual enrolments could be different. They also include overseas students who applied via Ucas. However, the data do not include students who applied directly (via access courses, for example).

Some in the sector suggest that declining post-1992 university figures indicate that students from disadvantaged backgrounds are most likely to decide against higher education with higher fees”

although

“The post-1992s that increased their intake appear to be largely drawn from those that won places under the margin system, which reallocates places to cheaper institutions.”


The table below shows the percentage change between 2011 and 2012, sorted by size of change. Staffs comes out pretty much mid-table this way, and it is interesting to see where our competitors are. (A short sketch of the calculation, and of finding any institution’s position, follows the table.)

Institution name 2012 acceptances 2011 acceptances Change in number of places Change %
American InterContinental University – London 56 6 50 833.3
Birkbeck, University of London 664 289 375 129.8
Richmond, The American International University in London 89 40 49 122.5
Regent’s College, London (incorporating Regent’s Business School, London) 78 46 32 69.6
European Business School, London 100 68 32 47.1
University of Stirling 1765 1253 512 40.9
University of Bristol 4717 3688 1029 27.9
University College London 4397 3617 780 21.6
University of Aberdeen 2900 2400 500 20.8
Glasgow School of Art 389 334 55 16.5
Ravensbourne 811 705 106 15.0
Royal Veterinary College, University of London 373 327 46 14.1
Royal Academy of Dance 58 51 7 13.7
School of Oriental and African Studies, University of London 986 870 116 13.3
Cardiff University 5799 5130 669 13.0
King’s College London, University of London 4331 3881 450 11.6
London School of Economics and Political Science, University of London 1416 1271 145 11.4
University of Edinburgh 5474 4951 523 10.6
York St John University 1638 1492 146 9.8
Queen Margaret University , Edinburgh 1107 1014 93 9.2
University of Winchester 1915 1756 159 9.1
Harper Adams University 792 731 61 8.3
BPP University College 1041 962 79 8.2
Aston University 2141 1992 149 7.5
Anglia Ruskin University 4362 4065 297 7.3
Newcastle University 4669 4357 312 7.2
Coventry University 5432 5107 325 6.4
University of Glasgow 4406 4149 257 6.2
Durham University 4026 3800 226 5.9
University of Ulster 5675 5360 315 5.9
University of Bath 2990 2832 158 5.6
Bishop Grosseteste University 612 580 32 5.5
University of Chichester 1589 1511 78 5.2
Queen’s University Belfast 3920 3736 184 4.9
Robert Gordon University 2423 2321 102 4.4
University of Chester 3064 2936 128 4.4
University of Cambridge 3401 3261 140 4.3
Norwich University Of The Arts 635 609 26 4.3
Central School of Speech and Drama, University of London 233 224 9 4.0
Brighton and Sussex Medical School 141 136 5 3.7
Hull York Medical School 147 142 5 3.5
Goldsmiths, University of London 1800 1743 57 3.3
University of Exeter 4356 4220 136 3.2
University of Strathclyde 2899 2822 77 2.7
Falmouth University 1443 1410 33 2.3
Buckinghamshire New University 2379 2327 52 2.2
University of Huddersfield 4962 4866 96 2.0
Birmingham City University 5308 5214 94 1.8
University of Brighton 5417 5326 91 1.7
St George’s, University of London (formerly St George’s Hospital Medical School) 657 646 11 1.7
Glasgow Caledonian University 3530 3473 57 1.6
Bangor University 2410 2372 38 1.6
University of Oxford 3281 3237 44 1.4
University of York 3749 3701 48 1.3
University of East Anglia 3540 3505 35 1.0
St Mary’s University College, Twickenham 1245 1237 8 0.6
University of Sussex 3221 3203 18 0.6
University of the Highlands and Islands 2161 2153 8 0.4
University of Leicester 3114 3113 1 0.0
University of Reading 2945 2948 -3 -0.1
University of Portsmouth 5289 5305 -16 -0.3
University of Nottingham 7160 7187 -27 -0.4
University of Warwick 3828 3846 -18 -0.5
UCP Marjon 800 806 -6 -0.7
Swansea Metropolitan University 1425 1446 -21 -1.5
Rose Bruford College 213 217 -4 -1.8
University College Birmingham 1312 1337 -25 -1.9
University of Northampton 3017 3079 -62 -2.0
Cardiff Metropolitan University 2646 2704 -58 -2.1
Loughborough University 3359 3439 -80 -2.3
University of St Andrews 1696 1741 -45 -2.6
Staffordshire University 3790 3895 -105 -2.7
Southampton Solent University 3810 3920 -110 -2.8
Heriot-Watt University, Edinburgh 1814 1872 -58 -3.1
University of Manchester 7861 8114 -253 -3.1
Royal Holloway, University of London 2375 2452 -77 -3.1
Oxford Brookes University 3810 3934 -124 -3.2
Lancaster University 2778 2882 -104 -3.6
Courtauld Institute of Art, University of London 51 53 -2 -3.8
Leeds Trinity University 870 905 -35 -3.9
City University 2880 2997 -117 -3.9
London South Bank University 3740 3893 -153 -3.9
University of Abertay Dundee 1242 1301 -59 -4.5
Bath Spa University 1919 2021 -102 -5.0
University of Wolverhampton 4587 4855 -268 -5.5
Arts University College at Bournemouth 953 1011 -58 -5.7
Queen Mary, University of London 3484 3704 -220 -5.9
Keele University 2023 2153 -130 -6.0
University of Leeds 6428 6844 -416 -6.1
Imperial College London 2226 2377 -151 -6.4
University of the West of Scotland 4228 4516 -288 -6.4
Canterbury Christ Church University 3675 3927 -252 -6.4
University of Kent 4942 5281 -339 -6.4
University of Birmingham 5135 5520 -385 -7.0
Nottingham Trent University 6356 6857 -501 -7.3
ifs School of Finance 36 39 -3 -7.7
University of the Arts London 4305 4665 -360 -7.7
University of Gloucestershire 2215 2403 -188 -7.8
Stranmillis University College: A College of Queen’s University Belfast 244 265 -21 -7.9
University of Worcester 2681 2919 -238 -8.2
University of Essex 2907 3166 -259 -8.2
Roehampton University 2254 2455 -201 -8.2
Liverpool Institute for Performing Arts 219 240 -21 -8.8
Kingston University 6210 6809 -599 -8.8
Edge Hill University 3550 3900 -350 -9.0
Northumbria University 5714 6290 -576 -9.2
University of Sheffield 4711 5197 -486 -9.4
Sheffield Hallam University 7425 8211 -786 -9.6
University of the West of England, Bristol 6584 7284 -700 -9.6
Edinburgh Napier University 3480 3854 -374 -9.7
University of Liverpool 3945 4369 -424 -9.7
Bournemouth University 3920 4342 -422 -9.7
Middlesex University 4139 4619 -480 -10.4
University of Hull 4356 4880 -524 -10.7
De Montfort University 4638 5230 -592 -11.3
University of Bedfordshire 3815 4303 -488 -11.3
University of Westminster 4503 5088 -585 -11.5
University of Sunderland 2503 2831 -328 -11.6
Swansea University 2939 3326 -387 -11.6
British School of Osteopathy 69 79 -10 -12.7
Plymouth University 5173 5923 -750 -12.7
University of West London 2394 2742 -348 -12.7
Liverpool John Moores University 5473 6284 -811 -12.9
University of Southampton 4499 5189 -690 -13.3
Brunel University 2821 3279 -458 -14.0
Royal Agricultural College 313 364 -51 -14.0
University of Derby 3002 3555 -553 -15.6
University of Lincoln 3135 3715 -580 -15.6
University of Hertfordshire 4730 5618 -888 -15.8
Manchester Metropolitan University 7642 9083 -1441 -15.9
Liverpool Hope University 1476 1760 -284 -16.1
University of Central Lancashire 5318 6355 -1037 -16.3
University for the Creative Arts 1727 2064 -337 -16.3
University of Surrey 2104 2515 -411 -16.3
University of Salford 3953 4808 -855 -17.8
University of London Institute in Paris 49 60 -11 -18.3
University of Cumbria 1950 2391 -441 -18.4
University of Bradford 2748 3377 -629 -18.6
University of Dundee 2141 2637 -496 -18.8
Newman University College, Birmingham 535 659 -124 -18.8
Aberystwyth University 2655 3283 -628 -19.1
University of Wales, Newport 1147 1426 -279 -19.6
University of East London 4385 5510 -1125 -20.4
University of Glamorgan, Cardiff and Pontypridd 3265 4105 -840 -20.5
Glyndwr University 731 926 -195 -21.1
Scottish Agricultural College 275 350 -75 -21.4
Heythrop College, University of London 149 192 -43 -22.4
Leeds Metropolitan University 6265 8084 -1819 -22.5
University of Greenwich 4034 5223 -1189 -22.8
University Campus Suffolk 1398 1811 -413 -22.8
Peninsula College of Medicine & Dentistry 239 319 -80 -25.1
University of Bolton 1259 1686 -427 -25.3
University of Wales Trinity Saint David 821 1156 -335 -29.0
University of Buckingham 157 244 -87 -35.7
Writtle College 293 458 -165 -36.0
London Metropolitan University 4079 7209 -3130 -43.4
Royal Welsh College of Music and Drama (Coleg Brenhinol Cerdd a Drama Cymru) 31 60 -29 -48.3
Institute of Education, University of London 28
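
As mentioned above, a short sketch of how the change column, the sort order and any one institution’s position can be reproduced. The CSV name and column names are assumptions of mine, not UCAS’s.

```python
# Sketch: year-on-year change in acceptances, sorted as in the table above.
# Assumes a hypothetical CSV "ucas_acceptances.csv" with columns:
# institution, acc_2012, acc_2011.
import pandas as pd

df = pd.read_csv("ucas_acceptances.csv")
df["change"] = df["acc_2012"] - df["acc_2011"]
df["change_pct"] = (100 * df["change"] / df["acc_2011"]).round(1)
df = df.sort_values("change_pct", ascending=False).reset_index(drop=True)

# Where does a given institution sit in the sorted table?
pos = df.index[df["institution"] == "Staffordshire University"][0]
print(f"Staffordshire University is row {pos + 1} of {len(df)}")
```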

Implications of ‘Dimensions of quality’ in a market environment

Graham Gibbs has published, through the HEA, a further paper on Dimensions of Quality, now considering the market environment in which we supposedly operate, and some of the implications, which are reported below together with some of my comments. The original paper can be accessed here.

1 This report concerns the practical implications of the use of performance indicators for the way institutions are currently attempting to attract students, improve quality, improve ‘value for money’, and improve their relative standing in relation to educational provision. Institutions are responding to this data-driven market in a variety of ways, some of them perhaps unexpected and some with probably negative consequences. The report suggests ways in which the use of data in a market could be tuned up to have more positive effects.


2 The conclusions of the report are based on:
• examination of the data currently available to students and used by institutions, and their validity and usefulness;
• literature about performance indicators in higher education, and also literature about the effect that performance indicators and markets have on the behaviour of organisations in any public sector, such as schools and hospitals;
• meetings with those senior managers responsible for educational quality within institutions, both in national gatherings and through interviews within 12 institutions of a wide variety of types;
• examination of institutional documentation, for example about how quality data are reported and used internally, and institutional responses to the Browne Report.


3 It is not yet clear whether institutional attempts to improve National Student Survey (NSS) scores and other quality indicators are having any effect on student recruitment, let alone on learning gains. To a large extent the market is perceived to be driven by reputation, just as in the past. US research shows that reputation tells you almost nothing about educational quality, use of effective educational practices, or learning gains, but merely reflects research performance, resources and fee levels. It is uncertain whether the use of more valid indicators of educational quality will gradually change perceptions of what reputation is about, and turn it into a more useful guide to student choice.


4 Data currently provided to potential students, such as Key Information Sets (KIS), and used by institutions to make decisions, include some valid indicators of educational quality and also include variables that are invalid or difficult to interpret. There is scope to improve the value of the information provided to students, and used by institutions, by changing some of the variables and collecting and collating somewhat different data. In particular it is not yet possible for students to see what educational provision their fees will purchase (such as class size, which predicts learning gains) other than the proportion of class contact hours (which does not predict learning gains).

Is this an area of opportunity for individual institutions to make the most of the information that they hold regarding class sizes, and who is doing the actual teaching?
5 The aspects of educational provision that institutions pay attention to in their internal quality assurance processes often overlook crucial indicators. Any new quality regime should ensure that it focuses on the right variables, and the use of valid quality indicators in KIS and elsewhere would help to lever appropriate attention.


6 Regardless of the validity of currently available data, institutional behaviour is being driven by data to an unprecedented extent. In most institutions there is now an annual cycle of analysis of performance indicators at both institutional and departmental level, followed by planning to improve them, again at both institutional and departmental level. Departments are much more aware of how their competitors at other institutions perform, in relation to the main indicators. In some cases this annual analysis of data has in effect taken over from periodic review and QAA audit as the main driver of quality assurance and enhancement (and without this having been planned or agreed). Any future revision of national quality assurance mechanisms, and requirements on institutions, will need to take this reality into account.

We have the opportunity to embed our own portfolio performance review tool into our annual processes, while at the same time reviewing how we carry out internal annual monitoring as well as responses to student surveys. We might be able to do more if we join all of these activities together, rather than seeing them as separate and distinct.
7 Most currently available data are about degree programmes, and students apply to study degree programmes. In contrast much quality assurance, and course design and documentation, has focused on individual modules. In modular course structures the collection of modules that students experience may relate loosely to the unit of analysis of the NSS. This confronts modular institutions and modular degree programmes with major problems in interpreting and acting on the degree-programme-level data from the NSS. A consequence is that some institutions are greatly reducing the number of combined Honours degrees offered and moving away from modularity back to traditional single subject degree programmes with greater alignment of student experience with the unit of analysis, and labelling, of public indicators of quality. There are consequences of this shift for the diversity of curricula and for student choice, which may have negative impacts.

This is an interesting assertion. Many institutions have moved away from joint programmes for other reasons: the confused market offer, inefficiencies in delivery, and a poor student experience where there can be a lack of belonging. However, it is true that as we reduce the portfolio, we do potentially reduce the diversity of curricula within remaining awards. The difficulty of associating awards with published NSS results is well known – more work can be done here to make sure we understand which awards are categorised where (and why!), so that sensible interpretation of published data can be undertaken, and the right changes made.

8 There has been a considerable emphasis over the past decade on training and accrediting individual teachers, rewarding individual teachers, and on funding local innovation in teaching. There is a marked lack of corresponding institutional emphasis on the effective operation of ‘programme teams’ (all those who contribute to the teaching of a degree programme), on developing leadership of teaching, and on curriculum design and assessment at programme level. A change of focus of national and institutional enhancement efforts is overdue. Institutional career structures still need to be developed that reward leadership of teaching, rather than only individual research and individual teaching. Funding for innovation, both within institutions and by national bodies, should be targeted on programmes rather than on modules and on the involvement of entire programme teams rather than on individuals.

Do we know enough about teams? For instance, all new staff are required to complete a PGCHPE. But what are we doing about experienced staff, and those non-teaching staff who are associated with the programme?
9 Many institutions are using data to identify a previously overlooked quality problem and address it: the most common example is poor and slow feedback to students on their assignments. Institutions are then making very broad scale changes that affect all degree programmes and all teachers in order to address these problems. Data are successfully driving change and in some cases there is clear evidence of improvements in NSS scores as a consequence of the institution-wide change. Some centrally determined changes will limit teachers’ scope to enhance teaching in contextually sensitive ways, and will make things worse.

I think we all recognise that a "one size fits all" approach does not always work. However, we have to ensure that when we do identify changes to be implemented across the institution, we identify when exceptions are valid, and equally when they are not!


10 An increasing number of institutions are using data to track progress in emphasising the ‘institutional USP’. They are marketing themselves as distinctive in relation to a particular indicator, such as employability, and emphasising that variable in programme-level learning outcomes and in institution-wide quality enhancement efforts, and then collecting better data than are currently available in order to monitor progress.


11 In light of the prominence given to overall student satisfaction data in KIS and league tables, it is not surprising that institutions are addressing ‘satisfaction’ issues with vigour. This may be less to do with teaching than with consistently high standards of service delivery. In some cases these two domains of quality overlap, as with policies and practices concerning assignment turnaround times. Many institutions have a range of initiatives designed to improve service delivery, using NSS data to target efforts.

Yep – we’ve all got those! But the really interesting thing about consistent times for feeding back on assignments is this – even when we know we meet our targets, even when we tell our students what we are going to do, we still receive poor results for feedback! There is still a perception gap between what we mean by feedback, what we consider to be timely, and what our students think.

12 While there is a sense in which students are being treated as consumers of a product, institutions with good and improving NSS scores often have initiatives that engage students as co-producers of knowledge, or partners in an educational enterprise. Attempts to improve student engagement are taking many forms and sometimes involve students having responsibility for administering and interpreting student feedback questionnaires, and acting as change agents, and also central support for activities run by the students’ union that relate to educational provision. The extent to which NSS scores for a programme reflect extra-curricular initiatives of this kind is unclear, but some institutions are behaving as if they are important.


13 One focus of attention of the interviews undertaken for this report was whether institutions are focusing on ‘value for money’ by paying renewed attention to using cost-effective teaching methods in order to deliver a good quality of education given the level of fees and other income. There seems to be plenty of evidence of a squeeze on resources, and adoption of practices that save money, but not of an equivalent focus on using more effective methods. There is a need for a national initiative on cost-effective teaching so that, where reduced resources force changes to teaching practices, it might be possible to maintain or even to improve student learning.


14 Some of the institutions that are charging the lowest fees are suffering from competing demands to maintain or enhance their research efforts in order to retain research degree awarding powers. Attempts to improve teaching quality in such contexts face challenging conflicts of interest.

“What Works” – HEA/Paul Hamlyn Foundation project on student retention and attainment

Staffordshire University is a partner in this project, and three of our discipline areas will be working specifically on clearly defined projects over the next three years:
- Business Management
- Music Technology
- Engineering

We recognise the importance of the development of “belonging” in the first year of study. We believe that the inculcation of a sense of cohort belonging (through involvement, engagement and connectedness with the university experience, teachers and peers) is key to adapting to change.
Our focus will include the classroom environment and the core practices of education as key influencers on student experience and success, reflecting the salience of the related notions of a learning community and transition pedagogy.
We will therefore focus on the transition to higher education and how students learn to engage with the academic sphere, maximising the opportunities for students to develop a sense of belonging and identity by improving the range of support mechanisms that we offer. In parallel to this, we will develop and make readily available more robust data on student engagement and attainment.

The proposed work streams across three selected programmes are as follows:

• Revise personal tutoring to improve relationships between students and staff and to support the development of graduate attributes (one award initially)
• Develop the knowledge culture and identity of students through engagement with the Staffordshire Graduate attributes programme (one award initially)
• Develop mentoring for level 4 students by level 5 students (one award initially)

For all programmes we will:

• Review and improve pre-enrolment communication and activities
• Review and improve welcome semester activities
• Improve data systems to support better tracking of student engagement and success
• Pilot the use of a student engagement survey based on the Australasian Survey of Student Engagement (AUSSE)

Outcomes realised through participation:

• Students will develop a better sense of belonging to HE and to the institution, with stronger relationships with academic staff, especially their personal tutors, which will encourage greater engagement with learning and teaching activities.

• Academic staff will be more engaged with their personal tutees, and will be able to support the further development of an academic community.

• Improved data reports will be developed to help academic and faculty management staff understand and recognise overall trends in withdrawal, retention and success.

• Improvements in retention and success at award level will be measured through our portfolio performance review system.

Expected quantitative measures of outcomes:

• Reduced withdrawal rates, especially at level 4
• Improved progression rates through all levels
• Improved percentage of students gaining 1sts and 2(i)s
• Improved results in internal/national student surveys
• Evidence base of student engagement

Degree Classifications 2011-12

“Statistical First Release 183 – Student Enrolments and Qualifications”, published by HESA on 10-1-13, provides early information on student enrolments and in particular on student attainment last year.

(from http://www.hesa.ac.uk/content/view/2667/393/)

Of those gaining a classified first degree, the proportion who obtained a first or upper second has shown a steady increase from 61% in 2007/08 to 66% in 2011/12.

67% of first degrees undertaken through full-time study in 2011/12 achieved first or upper second classifications compared to 53% of those undertaken through part-time study.

Staffordshire University awarded about 55% “good degrees” in 2011-12, which is an increase on the previous year, but still significantly behind the national average and our competitors.

Clearly, this does have an impact on league tables, where it is used in the “value added” calculation, for example in the Guardian. Ultimately it could have an impact on recruitment – where would you choose to go: the university with the highest or the lowest probability of getting a 1st or 2(i)?