MOOCs and KIS event at University of Hertfordshire

A week or so ago, I wrote an article here setting out resolutions for the academic year, in which I said there would be no more articles on MOOCs. I was being economical with the truth.

Last week I was invited to speak at an event on Quality Assurance and Quality Enhancement in e-Learning, an event supported by the Higher Education Academy, to share my thoughts on and experiences of MOOCs.

As well as MOOCs, we also heard about issues relating to KIS.

MOOCs and Quality Issues

Members of the QAQE steering group (Helen Barefoot of the University of Hertfordshire and Jon Rosewell of the Open University) provided an overview of MOOC issues as follows:

We started with a definition of a MOOC, including the distinction between xMOOCs and cMOOCs and the initial 2008 developments based on connectivism at the University of Manitoba, moving on to the xMOOC platforms Coursera and Udacity, used at San Jose State and Georgia Tech, and finally edX at MIT and Harvard.

A further series of providers was identified: Eliademy, Open2study and FutureLearn, the UK provider which will go live on 18th September.

The broader context was considered, in particular what was meant by “open”:

  • Open source?
  • Open licensing, such as Creative Commons?
  • Open content on YouTube etc.?
  • Open universities and distance learning?

The question “Why bother with quality?” was addressed, with the following stakeholders considered to be important:

  • Students
  • Employers
  • Authors
  • Institutions
  • Funders
  • Quality agencies


However, tensions were identified when comparing a massive course to a more traditional one, posing challenges for both quality assurance and quality enhancement:

  • Delivery – f2f or distance
  • Accreditation – credit or none
  • Price – at cost or free
  • Entry – selective or open
  • Scale – personal or massive
  • Support – intensive or unsupported
  • Pedagogy – constructivism or transmission
  • Teacher – star or anonymous


When considering the idea of “massive”, it was suggested that the issue is not large size but scale independence (and the scale independence needs to be financial, technical and pedagogical). When considering learning design at scale, the following need to be considered: individual learning, small group collaboration and the impact of large communities.

When considering MOOCs, we also need to be aware of what we mean by “open”, “online” and “course”. When making a comparison to the courses we are more used to, then size, goal, learning outcomes, measures of completion and retention, and course structure all start to have different meanings. Since MOOCs are actually aimed at different learners, they become different from the usual university course and don’t always fit with normal measures of quality or success.

The QAA view was portrayed as a clear message to safeguard quality and standards, based on the Quality Code and applied to all learning including MOOCs (Chapters B1, B3, B5 and B6). This was different from a message I had previously heard, in which QAA were interested only if courses were offered for credit.

Experiences of MOOCs

Following this, I presented my view of MOOCs, sharing personal experiences of the courses I have completed, and providing a critique (naturally) of the neo-liberal technological solutionist approach being offered. I concluded with the PA Consulting report that showed that heads of institutions in the UK were not convinced that this was a disruption. I think I might agree with them.

My Experience of MOOCs – Herts Uni video

Key Information Sets

A session on the early evaluation of KIS, by Catherine Benfield of HESA, provided the following information:

  • New data and features available from 19th September
  • Data collection reopens for updates the day after and the site will be refreshed once per week
  • Data set hosted by HESA for download as XML file
  • The scale of next year’s review will be much smaller, with just small practical updates


However, there is also an ongoing UK-wide review of information provision, led by HEFCE, with six strands:

  1. Advisory study on how students use the data and make decisions about studying
  2. NSS – review of purpose
  3. NSS – detailed analysis of results since 2005. Report due in spring 2014
  4. Review of the Unistats website. Report due in autumn 2014
  5. How to improve info on salaries and employment outcomes
  6. Strategy overview

The question was also raised – “should we just allow the market to produce websites?” – so that HESA would collect the data and the market (for instance, companies such as Which?) could produce the advisory websites. An interesting idea.

The final presentation, on an early evaluation of user experiences of KIS by Moira Sutton of the University of Derby, found:

  • Participants were generally positive
  • End users were more positive than professional users, e.g. HE staff
  • Most people reach the site directly via its URL
  • There is little traffic from anywhere except widgets
  • It is mainly used by prospective and current HE students and HE staff!
  • Participants were most interested in entry requirements, course content, quality of experience and employability, with the last of these the most important.