Learning Lessons from School

Often at this time of year, I write a “review of the year” blog piece, summarising my writing over the year, and what HE and my own university in particular have gone through. That all changes this year – my university and I parted company in March (I think I’m allowed to say that) – so there is little to review from that perspective.

However, it was still a big year in HE – a new Higher Education and Research Act, the first gold, silver and bronze TEF awards, and now a consultation on the role of the new OfS.

In all of these, the importance of the student experience, and particularly the use of metrics to demonstrate how well a university performs, is paramount – ostensibly to allow prospective students to make choices, but more likely to allow league table compilers, journalists and others to make (specious?) comparisons.

So, if I can’t write much about my experiences of HE this year, what can I share? Since September I’ve spent most of my time both as a postgraduate student and learning to teach in secondary school classrooms, and there are three lessons from that experience which, if I were a subject lead in HE, I’d be considering.

I fully accept that university lecturers and school teachers do different jobs. But if you’re in a teaching intensive post-92 university, chances are that some of what you need to do is not that different from a teacher in a local academy or school.

Lesson One – Assessment and Feedback

Pretty much every university and course scores poorly on this in the National Student Survey. Learning and Teaching committees agonise over it, develop complicated feedback procedures and guidelines, and set turnaround times for returning marks (15 to 20 days), but wonder why students are still unhappy about feedback.

Maybe it’s because students compare it with their school experience. Assessment happens constantly: formative assessment (perhaps peer or self assessed) in nearly every lesson, and personalised verbal and written feedback every couple of weeks. If students take a test, marks and feedback are returned within the week. This is what students have come to expect. Universities could think about how to develop a transition that enables students to adjust to university approaches, while equally supporting them in those early, vulnerable weeks.

Lesson Two – Learning from Teaching Observation

Peer (or heaven forfend, management) observation of teaching in universities is, at its best, a collaborative experience, sharing good practices and relying on professional approaches to self-development. However, for many staff this amounts to being observed for perhaps one hour a year, and observing others for maybe two or three hours.

New lecturers may be observed a little more in their first year as part of any postgraduate certificate they are taking.

Contrast this with the school experience. As a trainee teacher I am currently observed teaching for 10 hours a week – every session that I deliver. At the end of each session there is a four-page feedback form, where I am scored against 8 standards, each with 3 levels of competence. Advice is provided on what worked well and what I can do to improve. Targets are set for the next week’s sessions, and I have to provide my own reflection. Guess what – my teaching improves each week, and I develop new techniques and new ideas from the advice I get. I also go and observe a range of other people whenever I want to.

If we want to make big changes in teaching practice, and expose people to a greater number of great teachers and different ideas, then the annual observation round leaves much to be desired, as does the way in which new lecturers are supported.

Lesson Three – Know Your Students

It’s well known that a sense of belonging aids student retention and attainment, as evidenced in projects such as the Paul Hamlyn/HEA What Works project, as well as in universities’ focus on developing course identity.

I talked to a university undergraduate course leader the other week, with maybe 30-40 students in each year. They said that they did not know the names of all their final year students, let alone the others. In school, I’m expected to know the names of all my pupils – and use them. So in a class of more than thirty 11-year-olds, I can talk to each of them individually, and know a little about them and their abilities. And importantly, I should be able to do this within a couple of weeks of meeting them.

If this is what pupils are used to in school, it’s maybe not surprising that they don’t feel a sense of belonging when they first come to university.

Conclusion

So what’s to be done?

Firstly, I don’t think universities need to replicate schools: the two sectors have different functions, cultures and behaviours. However, a really hard look at transition to university, again particularly in teaching focussed universities whose students may lack some of the cultural capital to thrive instantly, could provide a way of maximising student engagement and attainment.

Seriously, why not send your teaching staff to spend two weeks in a secondary school, shadowing a teacher and seeing what the school experience is these days – and then reflect on whether you could develop your first semester, in terms of formative assessment, teaching practice and sense of belonging, to really help your students?

Does UK HE have a retention problem?

Last night I attended an event at King’s College London, hosted by the UPP Foundation and Wonkhe, looking at retention issues in UK higher education. The format was a series of initial thoughts from each of the five panel members, followed by a lively discussion that showed the importance of this topic.

Richard Brabner

Richard introduced this as the second of three workshops on the student journey. He pointed out that HESA stats on non-continuation show that it is getting worse, especially for students from disadvantaged backgrounds. He reminded the audience that, in light of this, Les Ebdon of OFFA expects the next access agreements to focus on retention.

Liz Thomas

Liz started by explaining that UK retention figures are in fact much better than those of most European countries. In countries with free tuition, there was a feeling that getting students out of the system was part of the quality system. In a world dominated by fees and student loans, that attitude cannot prevail: we admit students to our courses, and so we have an obligation to help them succeed. So we do have an issue around student success and retention, in particular around differential levels of success, retention and employment outcomes when we consider BME, WP and other factors.

From the HEA/Paul Hamlyn What Works project it was clear that learning and teaching is critical to student success and retention, by building a sense of belonging in the academic sphere. This goes beyond the curriculum: it is about the whole institution recognising that it needs to make students successful, and considering the role and contribution of all staff.

Sorana Viera

Sorana of the NUS believes that UK HE does have a retention problem for some groups of students, and suggested that an unforeseen consequence of TEF is that game-playing to satisfy the metrics could exacerbate the situation. The NUS view was that the rushed nature of TEF potentially leaves dangerous holes. Since the key metric that universities can influence is non-continuation, all eyes should be on retention.
Universities should invest more in those supporting activities that are evidence based, and Sorana cited the What Works project as an example of this. If evidence is presented in accessible ways, then the NUS will champion it.

In particular, the impact on commuting students was raised – these are students with financial pressures and family and work commitments, who may have chosen to study at a local university which may not be the right university for them.

Alex Proudfoot

Alex showed that some of the issues for alternative providers are quite different. Students are much more likely to be from a BME background, or to be aged over 30, so these providers are dealing with very different cohorts of students.

A focus for alternative providers was on delivering courses that focus on employability by creating industry links and ultimately an industry community within the college where staff and students might collaborate on projects outside of class.

In terms of pathways and transitions into HE, students who progress through the same provider from levels 2 and 3 have better retention at HE levels.

For students with low entry qualifications, classes on study skills are a compulsory part of the curriculum, rather than an additional optional choice for the student.

Ross Renton

Ross highlighted the huge differences in retention and success based on ethnicity. He emphasised the need to develop an understanding of who is joining your university or course, and to develop a relationship with them before they arrive or join the course.

At Hertfordshire they had previously targeted POLAR quintile 1 and 2 students on entry, and provided peer mentoring plus other additional activity, tailored to each student. Retention figures improved by 43% for these students, and DLHE shows a better rate of graduate employment. This intensive personalisation works, but it is expensive.

Ross also highlighted the fact that problems need to be owned by everyone – it’s not a matter of sending a student off to some student hub; all academic staff need to take ownership. There is also a need to systematise personal tutoring, so that key and meaningful conversations take place at the right times for all students, including at all transition periods, long holidays and so on.

In the future Ross saw some risk in being overly focused on the use of metrics and analytics – this is still about people working with people.

Panel Q&A

Key points in the Q&A session were around:

  • How do we support hourly paid lecturers – not delivering HE on the cheap, but supporting the right staff properly
  • The current retention metrics don’t allow for students to step out of HE with interim quals in a flexible framework
  • Staff also need to feel that they belong, so we need to consider institutional culture and how to support students through a whole-institution approach.
  • How can we build success in L&T, including retention and success, into reward and recognition for staff?
  • How do we make the campus more “sticky” for students living at home? The research on commuting students suggests that these students feel the campus is not for them, and that they feel marginalised and invisible. Details in a prospectus will cover accommodation but not local travel. Universities were often not set up to support these students, expecting them to be in 4-5 days a week.
  • The tax burden for those who drop out but have student debt – the ethics, and who should pay? One year of study should be seen as a success
  • Can we use analytics to create better informed interventions? Otherwise it is difficult to personalise in a mass system without good real-time information.

Takeaways

Certain key factors stand out:

  • The need to look carefully at differential retention and success, and to ensure that TEF does not drive perverse behaviours
  • The opportunities to use better analytics to personalise student support
  • The need for rigorous and meaningful personal tutor systems
  • A pressing need to understand how a sticky campus can support commuting students and meet their specific needs.

 

Non-Continuation Rates

Last week, HESA published their latest data on student continuation rates. This is an important set of figures for a number of reasons: non-continuation directly affects the finances of universities; non-continuation is potentially a failure for the individual as well as the institution; and the data is used in some league tables.

A concern is that, overall, the non-continuation rate has risen across the sector (and indeed for us at Staffordshire University), with the national figure rising from 5.7% to 6.0% of students who entered in 2013/14 not progressing to the second year. The headline statistics are:

  • 6.0% of UK domiciled, young, full-time, first degree entrants in 2013/14 did not continue in higher education in 2014/15.
  • 10.2% of UK domiciled, full-time, first degree starters in 2013/14 were projected to leave higher education without gaining a qualification

Usefully, HESA provides breakdowns of the data by both age of student and the POLAR3 low participation indicator. This doesn’t necessarily provide any greater detail than that already held by an individual institution, but it does allow comparisons to be made with comparator institutions and benchmarks.

Looking at the data for Staffordshire University, we can see the following (percentage no longer in HE, against the HESA benchmark):

Young entrants: 12.2% (benchmark 10.1%)
Mature entrants: 14.1% (benchmark 13.8%)
All entrants: 12.8% (benchmark 11.4%)
Young entrants from low participation neighbourhoods: 15.7% (benchmark 11.2%)
Young entrants from all other neighbourhoods: 11.2% (benchmark 9.7%)

So, no surprises there, but it does add weight to the argument that we should revise the way in which we look at the interventions necessary to support retention. If, as evidenced here, there are groups of students who are more likely to withdraw than others, then a “one size fits all” approach to student retention will not deliver all the necessary outcomes.

In addition, HESA provides data on non-continuation rates by subject studied, as well as by entry tariff and type of qualification. The rates by entry qualification are summarised as:

Entry qualifications and percentage no longer in HE (all subjects):

01 A level/VCE/Advanced Higher grades AAAA or Scottish Highers grades AAAAAA 1.4%
02 A level/VCE/Advanced Higher grades at least AAA or Scottish Highers grades at least AAAAA 1.8%
03 A level/VCE/Advanced Higher grades at least AAB or Scottish Highers grades at least AAAAB or AAAAC or AAABB 2.5%
04 A level/VCE/Advanced Higher grades at least AAC 3.1%
05 A level/VCE/Advanced Higher grades at least ABB or Scottish Highers grades at least AAABC or AAACC or AABBB or AABBC 3.1%
06 A level/VCE/Advanced Higher grades at least ABC or BBB or Scottish Highers grades at least AABCC or ABBBC or ABBBCC or ABBBB or BBBBB 3.9%
07 A level/VCE/Advanced Higher grades at least ACC or BBC or Scottish Highers grades at least AACCC or ABCCC or BBBBC or BBBCC 3.9%
08 A level/VCE/Advanced Higher grades at least BCC or CCC or Scottish Highers grades at least ACCCC or BBCCC or BCCCC or CCCCC 4.2%
09 Tariff points > 290 4.8%
10 Tariff points > 260 5.3%
11 Tariff points > 230 6.6%
12 Tariff points > 200 7.4%
13 Tariff points > 160 9.2%
14 Tariff points > 100 11.3%
15 Tariff points > 0 12.9%
17 Level 3 and A level equivalent qualifications with unknown points 13.9%
19 International Baccalaureate 3.4%
20 HE level foundation course 6.1%
21 Access course 11.1%
22 BTEC 11.5%
23 Higher education qualification – Postgraduate 7.1%
24 Higher education qualification – First degree 7.6%
25 Higher education qualification – Other undergraduate 8.1%
26 No previous qualification 24.1%
27 Other qualifications not given elsewhere 17.0%
28 Unknown qualification 32.6%
   
All qualifications 6.0%

Or looking at this graphically:

[Chart: non-continuation rate by entry qualification group]

Important lessons from this data? As A level tariff points decrease, the likelihood of non-continuation increases. Also, institutions or courses that recruit significant numbers of students with BTEC qualifications might expect higher withdrawal rates.

Putting these factors together – age, POLAR3 neighbourhood, subject and entry grades – we can use better data analytics, linked to market segmentation and enhanced personal tutoring, to identify how to provide the right support to all students, in a way that is tailored to their needs and expectations. The key part of this will not be the identification of possible at-risk students; the more difficult work will be in deciding what interventions are needed to support an increasingly diverse range of students, and how to deliver them.
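To make that concrete, here’s a purely illustrative sketch in Python of how those published risk factors might be combined into a simple support segmentation. The field names, thresholds and weightings are all invented for the example – they are not HESA definitions, and certainly not any institution’s actual model.

```python
# Illustrative only: a toy segmentation of entrants using the risk factors
# discussed above. Field names and thresholds are invented for the example,
# not real HESA definitions.
from dataclasses import dataclass


@dataclass
class Entrant:
    age_on_entry: int
    polar3_quintile: int   # 1 = lowest participation neighbourhoods
    entry_route: str       # e.g. "A level", "BTEC", "Access"
    tariff_points: int


def support_segment(e: Entrant) -> str:
    """Assign a hypothetical support segment from known risk factors."""
    risk = 0
    if e.age_on_entry >= 21:            # mature entrants withdraw at higher rates
        risk += 1
    if e.polar3_quintile <= 2:          # low participation neighbourhoods
        risk += 1
    if e.entry_route in {"BTEC", "Access"}:
        risk += 1
    if e.tariff_points < 200:           # lower tariff bands show higher non-continuation
        risk += 1
    return "enhanced personal tutoring" if risk >= 2 else "standard support"


print(support_segment(Entrant(19, 1, "BTEC", 160)))  # -> enhanced personal tutoring
```

The point is not these particular thresholds, but that any segmentation should be transparent enough for a personal tutor to see why a student has been placed in a group.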

Ultimately, we want all of our students to succeed, and if we have decided that these are the people that we want to educate, then we have to provide the best opportunities for that success.

 

All Watched Over by Machines of Loving Grace

This article tries to draw some distinctions between using management data or business intelligence, and the use of “big data”, with some caveats about the latter and the possible blind trust in numbers from the less-than-numerate.

Last week I went to a demonstration of a piece of software to help with student retention. There are some great things that the tool allows: integration with student information systems (including SITS), access to VLE analytics, and the ability for any member of staff to flag a concern about a student. In addition, the system looks at the last three years’ worth of retention data, looking at who withdraws and why, and then identifies correlations (if not causality).

So far, so good. I’m a big fan of exploiting data that we have available to us, to allow us to perform more effectively and successfully.

For example, looking at national data, we can identify how well we perform as an institution compared with others, either overall or in individual subject areas. From this we could identify how successful we are in recruitment, or in degree outcomes.

At a more granular level, we can look internally at portfolio performance information, to see how academic awards perform overall compared to each other – how overall retention rates or good degree outcomes compare between subjects. At a finer level of granularity still, we can look at the marks achieved on individual modules, their distribution, and how they compare to each other.

All of this provides simple and useful management information (or, at its most granular, business intelligence) which can help us to improve what we deliver, and improve the outcomes for our students.

What it does not do is provide a “big data” approach to education.

With enhanced student information, linked to personal tutoring or coaching we could start to look at how we could support individuals better, to identify their likely outcomes and to support them in achieving them. This is still a management information approach.

Going to the next stage, though, of profiling students based on their various individual characteristics, is where the water starts to get muddied.

We could provide tutors with information such as entry qualifications, attendance, engagement with the VLE and marks obtained. In addition, we also hold information on age, ethnicity, gender, socio-economic class, first-generation HE status, distance from home and many other characteristics. Individual staff may not be able to make any inferences from this themselves, but an algorithmic approach could.

Considering retention, the big data approach would look at all of this and provide algorithms to identify a risk factor for students withdrawing. It could use a traffic light system – red, amber and green – with those scoring red being the most likely to withdraw.
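As a minimal sketch – assuming some upstream model has already produced a withdrawal probability for each student, and using thresholds that are entirely arbitrary rather than taken from any real product – the banding itself might be no more than this:

```python
# Toy traffic-light banding of a predicted withdrawal probability.
# The thresholds are arbitrary illustrations, not values from any real system.
def traffic_light(withdrawal_probability: float) -> str:
    if withdrawal_probability >= 0.6:
        return "red"      # judged most likely to withdraw
    if withdrawal_probability >= 0.3:
        return "amber"
    return "green"


for p in (0.75, 0.40, 0.10):
    print(p, traffic_light(p))   # 0.75 -> red, 0.40 -> amber, 0.10 -> green
```

All of the interesting questions, of course, sit behind that probability: how it was derived, from which data, and with what hidden assumptions.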

Kate Crawford of MIT, writing in a blog for the Harvard Business Review, says:

But can big data really deliver on that promise? Can numbers actually speak for themselves?

Sadly, they can’t. Data and data sets are not objective; they are creations of human design. We give numbers their voice, draw inferences from them, and define their meaning through our interpretations. Hidden biases in both the collection and analysis stages present considerable risks, and are as important to the big-data equation as the numbers themselves.

Depending on how the algorithm has been designed, we would then decide where to focus our interventions. Assuming that there will always be withdrawals, maybe we wouldn’t intervene with students flagged as red, as their probability of withdrawing is high?

We’d need to look behind the algorithm. These are not as agnostic as the purveyors of technology might have us believe. If we found that students with BTEC entry qualifications were more likely to withdraw, we might flag them as a concern. However, we also know that students from a BME background are more likely to have a BTEC qualification. Our algorithm might now have produced an unintended consequence of flagging these students as a high risk of withdrawal, and our policy might even limit the interventions we might use.
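One simple way to surface that kind of unintended consequence is to compare flag rates across student groups before anyone acts on the output. A rough sketch, using made-up records purely to illustrate the check:

```python
# Rough sketch: compare how often each group is flagged "red" before acting on it.
# The records below are made up purely to illustrate the check.
from collections import defaultdict

students = [
    {"group": "BME", "flag": "red"},
    {"group": "BME", "flag": "amber"},
    {"group": "White", "flag": "green"},
    {"group": "White", "flag": "red"},
    {"group": "White", "flag": "green"},
]

flagged = defaultdict(int)
totals = defaultdict(int)
for s in students:
    totals[s["group"]] += 1
    if s["flag"] == "red":
        flagged[s["group"]] += 1

for group in totals:
    print(f"{group}: {flagged[group] / totals[group]:.0%} flagged red")
```

If one group is being flagged at a markedly higher rate, that is the point at which human judgement, not the algorithm, should decide what happens next.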

If we adopt a big data approach, just to this simple aspect of HE, further questions arise for me:

  1. What information do you share with teaching staff – do they see the colour coding?
  2. What do you share with students – do they know how they have been categorised?
  3. How easy is it to change categorisation?

The HE sector has plenty of data to use, and some of it could be treated as “big data”. Although it might be useful to identify some correlations, unless we include human agency in our decisions we cede control to a series of computer algorithms. We have to be prepared and able to challenge the outputs, and must not naively trust any set of numbers we are presented with.

I’ll finish with a couple of quotes from David Kernohan of JISC, from his post at http://followersoftheapocalyp.se/9-things-to-watch-in-2014/:

After all, if big data can reduce every problem to a bar chart, you don’t need people to choose the option that the machine tells you will make the numbers go up.

 

those of us who wish to continue being knowledge workers need to start making sense of data (and for that matter finance, but that’s maybe another story). If every policy position is “justified” by a slew of numbers, we need more people that can make sense of these numbers. Maths – naturally – is hard and we’d all rather be shopping or watching cat videos. But if we want to understand the decisions that affect the world around us, we need to learn to read numbers and to be confident in disputing them. Policy is now quantitative – we need to get better at teaching people how to participate.

My title, by the way, comes from a poem by Richard Brautigan, and was used as the title of a series of BBC documentaries in 2011.