IDEL Assignment: The value of open badges in education

Open badges are already in use by many educational institutions, not-for-profits and companies as a way of visually communicating skills and achievements (Mozilla, 2017). They are a branch of digital badges, differing in that they are built on a shared technical framework that allows any organisation, commercial, educational or otherwise, to issue open badges with consistent attributes.

Digital badges are used in education in several prominent ways (Gibson, 2015; Jovanovic, 2015). Firstly, as a way of motivating students to progress in their learning journey. Secondly, as an alternative assessment medium. Thirdly, as a pedagogical tool to signpost learning journeys. Finally, digital badges are increasingly used as a method of credentialing a learner’s achievements.

To evaluate the value of open badges in education, we must consider each of the ways in which open badges are used, with an eye on the benefit to the different stakeholders interacting with them. We must also consider the specific affordances of open badges compared with simpler digital badges.

Open badges as a motivational tool

Gibson (2015) highlights the use of badges as a gamification mechanic, alongside other options such as points and leaderboards.

While “meaningful gamification” (Nicholson, 2015) may be seen as a way of building intrinsic motivation, gamification as a whole is not without its critics (Bogost, 2011), and there are views that empirical research to date has yet to yield strong evidence of its impact (Hung, 2017), largely due to the broad definition of the terminology and how it is used in practice.

Teachers and educators considering digital badges as a motivational tool in their course design should bear in mind that “motivation is affected by context” (Hartnett, 2016). As with any educational tool, a strong understanding of the students and their motivations to learn will shape what is deployed, and how.

Abramovich (2013) highlights how differing levels of prior knowledge, as well as the type of badge used, can affect the usefulness of badges as a motivational tool. It’s perhaps also pertinent to consider the notion of sacrifice (Halavais, 2012) in helping to avoid the undesirable outcome of a badge actually becoming demotivating. This could happen if a badge becomes too easy to attain (thereby cheapening its value) or, on the flip side, too difficult to acquire.

Educators should also be wary of the risk that the acquisition of badges may come at the expense of the learning activity required to achieve them (Jovanovic, 2015).

While there is considerable discourse about the value of digital badges as a motivational tool, there seems to be limited insight or evidence about how the unique attributes of open badges contribute in this area. However, there are examples (Badgecraft, 2017) of how the technical nature of open badges has been thought to encourage participation amongst groups of learners, which in turn may influence the depth, volume and nature of learning. Given this, it is useful to consider the role that community (Halavais, 2012) and “symbolic capital” play in the value of the portability of open badges.

Open badges as an alternative source of assessment

Jovanovic (2015) highlights the opportunity open badges present as a new device for assessment, for instance, peer assessment and the recognition of soft skills.

There are advocates for open badges as a tool in this way (Strunk, 2017). Using open badges in this way perhaps inevitably intertwines with their use as a credential, but as an assessment tool, the metadata underpinning the open badge can help provide a verifiable depth to a learner’s progress. This in turn can be used to build a “transparent narrative of a learner’s knowledge” (Strunk, 2017).

It’s important to note here that open badges have a distinct advantage over standard badges – their metadata makes them particularly suited to this purpose. It could be argued, however, that as open badges have not yet been tested in this area at any great scope, the challenges may not yet be on the radar.
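To make that “verifiable depth” concrete, the sketch below shows the rough shape of the metadata behind a single open badge. The field names follow the Open Badges 2.0 assertion format; every URL and value is a hypothetical example, not taken from a real issuer. The `badge` field links out to a BadgeClass describing the criteria and the issuer, which is what allows an enquirer to trace how an achievement was assessed.

```python
# A minimal Open Badges 2.0 assertion, expressed as a Python dict for
# illustration. Field names follow the Open Badges 2.0 specification;
# all URLs and identifier values below are hypothetical.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.edu/assertions/123",        # hypothetical URL
    "recipient": {
        "type": "email",
        "hashed": True,                                # identity stored as a hash
        "salt": "deadsea",
        "identity": "sha256$...",                      # sha256 of email + salt
    },
    "badge": "https://example.edu/badges/peer-review", # link to the BadgeClass
    "verification": {"type": "hosted"},
    "issuedOn": "2017-12-14T00:00:00+00:00",
    "evidence": "https://example.edu/evidence/123",    # learner's submitted work
}
```

With “hosted” verification, an enquirer can fetch the assertion’s `id` URL from the issuer’s site and confirm it matches what the learner presents, which is the basis of the verifiability discussed above.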

Open badges as a pedagogical tool

Jovanovic (2015) introduces digital badges as a way to “scaffold” a learning experience. For teachers and students, this is particularly pertinent given the understanding of how autonomy can aid student motivation (Hartnett, 2016). As such, badges could be seen as a useful instrument to help explore the sweet spot between choice and guidance. There’s a natural link here to motivation, as badges as signposts may “focus student attention” and “nudge student exploration” (Rughiniş, 2013).

Again, the research into how the specific aspects of open badges factor into this usage seems limited. To speculate, however, the common framework could allow institutions to collaborate and signpost other ‘tracks’ that extend beyond their own institutional boundaries. This could also widen to incorporate learning analytics, providing increased understanding of how open badges can influence learning pathways (Strunk, 2017).

Open badges as a credential

While it could be argued there is value in the use of badges in the contexts discussed so far, the evidence for the enhanced value open badges provide seems limited. This criticism could be judged as unfair, as the most “obvious use” (Glover, 2013) for open badges is as a method of credentialing achievements in learning. The focus on this use is perhaps due to the metadata attached to open badges. This provides insight into the “context, meaning, process and result of an activity” (Gibson, 2015), distinguishing them from simple badges. Despite this leaning towards open badges being best suited to use as a credential, there is considerable discourse around their weaknesses.

There are three perspectives to consider (Kerver, 2016) when judging the value of open badges as a credential: the badge holder (or student), the badge issuer (e.g. an educational institute) and the badge enquirer (a different educational institute or an employer). I will primarily focus on the badge enquirer since, as a credential, the badge holder ultimately wishes to display it for inquiry, and it is the enquirer’s viewpoint which is of concern to the student.

The first challenge for an enquirer is to gauge what the badges represent. Given an open backpack can hold any badge from any issuer (for example, the portfolio could hold badges of progress, micro-credential badges or a representation of a more traditional qualification), this may cause challenges due to the conflicting rationales for acquisition, or the “monstrous hybrids” (Halavais, 2012). While the learner may have the flexibility to decide on what, how and where their achievements are displayed, there’s no guarantee that a learner would choose to do this, or indeed know what an enquirer wishes to view from their portfolio. Adding to this, there is the risk that an enquirer views badges as a motivational tool and not a credential, thus cheapening the credential (Bull, 2014).

Given the range of badge visualisations that could be employed by different issuers, at first glance it may be difficult to gauge the ‘sacrifice’ (a key contributor to the perception of badge value identified by Halavais, 2012), and the ranking of sacrifice within the portfolio. (In practical terms, it’s quite possible that a badge representing a Ph.D. could sit alongside a yoga participation badge.) Bear in mind that a badge, by its very nature, is intended to be a visual shortcut. It seems that while a badge may have symbolic capital within the community in which it was created, this may hold little sway outside that community – and ultimately this is the premise of ‘openness’.

It may also be worth considering the replication issues around badges. As Halavais (2012) quite rightly points out, something is only as valuable as it is difficult to fake. Given that anyone can issue credentials without barrier, and with no constraints around design, badges could be easy to falsify, and there is a lure for issuers to piggyback on the symbolic capital of other issuers through their visualisation choices.
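One safeguard the Open Badges specification does provide against a badge being presented by the wrong person is the hashed recipient field: the badge stores only a salted hash of the learner’s email, which an enquirer can recompute and compare. A minimal sketch of that check, with a made-up email and salt (the `sha256$` prefix convention comes from the specification):

```python
import hashlib

def verify_recipient(identity: str, salt: str, claimed_email: str) -> bool:
    """Check a hashed Open Badges recipient identity against a claimed email.

    `identity` has the form "sha256$<hex digest>", where the digest is
    computed over the plaintext email concatenated with the salt.
    """
    algo, _, digest = identity.partition("$")
    if algo != "sha256":
        return False  # only sha256 is handled in this sketch
    recomputed = hashlib.sha256((claimed_email + salt).encode("utf-8")).hexdigest()
    return recomputed == digest

# Hypothetical example values:
salt = "deadsea"
email = "learner@example.edu"
identity = "sha256$" + hashlib.sha256((email + salt).encode("utf-8")).hexdigest()

assert verify_recipient(identity, salt, email)
assert not verify_recipient(identity, salt, "someone-else@example.edu")
```

Note that this only verifies who the badge was issued to; it says nothing about whether the issuer itself is credible, which is the harder problem discussed here.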

Given the challenges around the context in which badges can be viewed, the visualisation choices and the ability to fake them, it seems a logical suggestion that for an enquirer to gauge the value of an open badge, they have to fall back on elements of the very guardian classes (Halavais, 2012) that the premise of open badges intends to unseat. The digital artefact included in this essay attempts to represent this ‘vicious circle’ visually.

It’s important to note that the challenges in understanding the value of an achievement (open badge or otherwise) are not new. Enquirers have always had to make judgments based on what a credential represents, whether paper-based, online or otherwise. However, it seems the construct of open badges does not resolve this problem, and indeed, once scaled up, will still struggle to articulate what a learner has accomplished and what this represents.

Putting the student to one side for a moment, it’s important to acknowledge some of the benefits that open badges do provide institutions. The technical infrastructure permits increased control to manage credentials over time; allows institutions to digitally sign and verify them; provides the opportunity to tie them into learning analytics to gauge enquirer behaviour; and can provide increased brand awareness opportunities for the institute (Kerver, 2016). However, it could be argued that these benefits can only have a significant impact if open badges become the standard. Without universal adoption (which I’d argue the flaws identified above may prevent), they could simply become another tool to manage. So rather than saving time, they could actually increase overhead.

When considering the value of open badges, one should also bear in mind that they are still an emerging technology. Open Badges 2.0 has just been released (IMS Global, 2017); however, this seems to be focussed on technical enhancements. At first glance, these do little to counter some of the fundamental issues outlined earlier. There’s also a sense of pragmatic acceptance that badge systems will fail before they succeed (Carey, 2012); however, five years on from this viewpoint, “the value of open digital badges has yet to be validated by compelling evidence” (Strunk, 2017).

It’s also possible to challenge the notion that open badges are indeed ‘open’. There are many providers of open badge ‘backpacks’ (Hamson-Utley, 2016), and using one requires registration with one of these parties. A single point of registration becomes a potential single point of failure, and in practice has been noted as a barrier, albeit a small one (Hole, 2014). So while badges are portable, they are only as portable as the technology that carries them allows. There could be increasing concerns about the openness of the platforms themselves for educational stakeholders, with commercial companies like Pearson (Belshaw, 2017) and Salesforce looking to explore this area (Google Groups, 2017).

It’s difficult to envisage a future for open badges as a meaningful form of credentialing without referencing the hierarchy they are intended to disrupt. There are attempts to bring a sense of hierarchy into the visualisation of badges (Belshaw, 2015); however, this may be little more than skin-deep. Perhaps if open badges were to move towards the proposition that they are a gateway to a portfolio of accomplishments, rather than the end point in themselves, this could encourage adoption.

I propose that the very openness of open badges is the chink in the armour that makes acceptance as a format by learners and enquirers unlikely, and this acceptance is the critical factor in universal adoption. While open badges do have value in several respects, there does not seem to be clear evidence of the increased value of openness over standard badges in these usages, and arguably their greatest advantage (as a credential) could be seen to be flawed.


  • Mozilla (2017) About Open Badges, Open Badges (Accessed: 14 December 2017).
  • Halavais, A.M.C. (2012) A genealogy of badges. Information, Communication & Society, 15:3, 354-373. DOI: 10.1080/1369118X.2011.641992
  • Gibson, D., Ostashewski, N., Flintoff, K. et al. Educ Inf Technol (2015) Digital Badges in Education. 20: 403. doi:10.1007/s10639-013-9291-7.
  • Jovanovic, Jelena and Vladan Devedzic. “Open Badges: Novel Means to Motivate, Scaffold and Recognize Learning.” Technology, Knowledge and Learning 20 (2015): 115-122.
  • Nicholson S. (2015) A RECIPE for Meaningful Gamification. In: Reiners T., Wood L. (eds) Gamification in Education and Business. Springer, Cham
  • Bogost, Ian. (2011). Gamification is bullshit.  (Accessed: 18 December 2017).
  • Aaron Chia Yuan Hung. Adelphi University. A Critique and Defense of Gamification. Journal of Interactive Online Learning Volume 15, Number 1, Summer 2017 ISSN: 1541-4914 57
  • Hartnett, Maggie, Motivation in Online Education 2016. Singapore: Springer Singapore. DOI 10.1007/978-981-10-0700-2. Pages 78-79
  • Abramovich, S., Schunn, C. & Higashi, R.M. Education Tech Research Dev (2013) 61: 217.
  • Viktoria Strunk and James Willis. Digital Badges and Learning Analytics Provide Differentiated Assessment Opportunities. Educause Review. (Accessed: 15 December 2017).
  • Badgecraft, 2017. Open Badges to Motivate Engagement. ‘Eastern Partnership Youth Forum’. (Accessed: 18 December 2017).
  • Rughiniş R., Matei S. (2013) Digital Badges: Signposts and Claims of Achievement. In: Stephanidis C. (eds) HCI International 2013 – Posters’ Extended Abstracts. HCI 2013. Communications in Computer and Information Science, vol 374. Springer, Berlin, Heidelberg
  • GLOVER, Ian and LATIF, Farzana (2013). Investigating perceptions and potential of open badges in formal higher education. In: HERRINGTON, Jan, COUROS, Alec and IRVINE, Valerie, (eds.) Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2013. Chesapeake, VA, AACE, 1398-1402
  • Kerver (2016). Whitepaper on open badges and micro-credentials. Surf.NL.  (Accessed: 20 December 2017).
  • IMS Global. Open Badges v2.0 IMS Candidate Final / Public Draft. (Accessed: 14 December 2017).
  • Bull, Bernard, 2014. “Beware of Badges as Biscuits”. Etale – Education, Innovation, Experimentation. (Accessed: 20 December 2017).
  • Hamson-Utley, Jordan. “Badging Models in Faculty Development.” (Accessed: 20 December 2017).
  • Anne Hole University of Sussex, UK. Open Badges: exploring the potential and practicalities of a new way of recognising skills in higher education. Journal of Learning Development in Higher Education ISSN: 1759-667X Special Edition: Digital Technologies, November 2014 (Accessed: 18 December 2017).
  • Belshaw, D. 2017. Open Educational Thinkering. Pearson WTF. (Accessed: 20 December 2017).
  • Google Groups, 2017. Open Badges. (Accessed: 19 December 2017).
  • Belshaw, D. 2015. Open Educational Thinkering. Towards a visual hierarchy of Open Badges. (Accessed: 20 December 2017).

Icons in graphic used courtesy of

Reflections on IDEL

Has it only been twelve weeks since the start of IDEL? It feels like a lifetime ago, yet just a fleeting moment at the very same time. Having come through the other side I feel a heady mix of inspiration, an excitement for the future, a sense that my understanding in key areas has really come along, with a slight tinge of battle-weariness and trepidation heading into the assignment. One last push and all that…

It’s been interesting to reflect back on the last three months, particularly in light of my first blog post. I wasn’t entirely sure what to expect when starting. It’s almost like starting a new job (what are my colleagues (students) going to be like? Have I made the right decision? Am I going to stick out like a sore thumb?), but it’s been the right move. The experience has been quite the revelation – it’s certainly one of the better decisions I’ve made in the last few years.

Before I get stuck into the actual content of the course, one of the real highlights has been the course experience as a whole. I’ll be honest, I’d lost my faith a little in digital learning before starting the course. I think this was largely on the back of client requests for rather linear course formats that did little to take advantage of what online learning offers, and perhaps I’d been suckered into this. I’m glad to say my eyes have been re-opened to the possibilities, with an increased vigour to explore the boundaries where possible.

There are many reasons why IDEL ‘works’, but for me, it’s because of the following factors:

  • The design of the course. It’s well crafted, with an obvious appreciation for feedback and incremental improvements each semester.
  • Each activity seems to be planned with real precision, particularly with regards to the ‘running order’. There seems to be a reason for everything, and this builds confidence as someone new to this game.
  • It places ownership on the student to deliver, but with very solid support and guidance if you need it.
  • It feels like the course ‘practices what it preaches’. This was my suspicion prior to the course, but I’m glad it’s turned out to be true.
  • I was a little skeptical about the mix of contact points throughout. Different tutors every couple of weeks, with a separate contact for the blog. How can a useful bond be struck up without it feeling like being passed from one person to another? I needn’t have worried. The various course tutors obviously work closely together, and there’s a sense that a lot is discussed ‘behind the scenes’. There’s an obvious passion for the course, and a real sense of duty (and respect) towards students on the course.
  • The course is very much one of curation, rather than instruction. I realise many of the papers involved some of the Edinburgh team in one form or another, but think this is as a result of the University’s standing in this field. I’d never really been involved in a course this deeply that in many areas happens ‘outside’ the VLE, and this is one of the key points I’ll take moving forwards.

What could have been different? One of the key themes throughout has been the push to keep a critical eye on things, so in keeping with that, I think it’s worth pointing out some areas that were a challenge. Personally, I found some of the initial subjects quite tough, but I think that’s down to my primarily commercial, rather than educational, background. They were hugely interesting and of value, but the learning curve was a little steep to start with. As the course progressed it moved away from the educational angle in part, and as such was more familiar territory.

I do think the forum format could be improved somewhat – it seems a little tired and not optimised for dialogue. I found it difficult to follow the various threads, and to tell which ones I’d already read. As always, Moodle could benefit from a ‘lick of paint’ to make it more visually pleasing, but again I think this lack of polish is actually conducive to the course – it doesn’t distract. And if we think of the VLE as part of the University of Edinburgh, I’d guess it fits with some of the more historic nature of the buildings? 😉

After the first sanctuary, there was something of a lull amongst the students. An activity such as a Skype chat could have brought everyone back together, and may be worth considering for next semester. I felt we all returned in dribs and drabs, and the network wasn’t as strong as it was in the first half of the course. Surprisingly, I felt the sense of community was at its peak mid-way through the course. But I also suspect life catches up with many after a few weeks: after that initial impetus to get going, there are demands on our time outside of the course that can’t be put off any longer.

Back to the course itself, and it’s difficult to summarise, simply due to its wide expanse. If I had to try and distill down some of the key messages that have had the biggest impact on me, they’d be:

  • Be wary of anyone talking about a revolutionary technology in education, it’s unlikely to have the impact that’s expected, and in the ways they are expecting
  • The role of teacher may change with the increasing use of digital technologies within education, but this does not make it any less critical
  • A digital learning experience is as much a part of the institution that’s delivering it as any bricks-and-mortar setting
  • We need to make sure everyone has a voice around the table in terms of technological use and development – at present Silicon Valley and commercial companies perhaps shout the loudest
  • Technology is influencing the direction of education as much as it is a tool (the rhizomatic/constructivist viewpoint)
  • Technology is not neutral. It has historical, societal, political influences to it.
  • Automated need not mean less personal.
  • A critical eye is key to avoid being swept up in the latest fashion, and to maintain focus on what really matters.

These are quite broad, but without regurgitating the entire course, it’s difficult to go into more specific detail.

Regarding my own performance on the course, I wish I had attempted more multi-modal blog posts, but time restricted this. I’m hoping that in future modules I’ll have more time to explore this, but as always time is the critical factor.

In terms of my professional life, the IDEL experience is having an immediate impact. I’ve been able to use what I’ve learned to help challenge some pre-defined beliefs with clients. It’s also given me additional confidence, largely because the content has helped fill some of the (significant) gaps in my thinking and knowledge. I also use the IDEL experience as a benchmark in many ways, talking about my three months and how clients could use curation and external tools to good effect, for example.

One of the more surprising outcomes has been my attempts to explain terms such as ‘instrumentalism’, ‘constructivism’ and ‘rhizome’ to my other half. It’s always useful to try to articulate these to someone not involved in the subject, to see if you really understand the concepts!

Finally, a real plus point is all the doors it’s opened, both in terms of new subject areas to explore, and the connections I’ve made. It’s been eye-opening to say the least, and am looking forward to seeing how this develops in 2018!

“LARCing” about

Catching up on Week 11 after a bout of sickness, I’ve been playing with the LARC tool provided by the University to explore the topic of learning analytics.

It provides different styles of report (lenient, moderate or strict) based on metrics typically used within learning analytics (attendance, engagement, social, performance, persona). I dived straight in and picked the extremes around engagement, simply because ‘engagement’ seems to me a particularly woolly metric…

LARC report, w/c 20 Nov 17. Strict, engagement.

LARC report, w/c 20 Nov 17. Lenient, engagement.

The contrast between the two is quite stark. The lenient style seems more human – it’s more encouraging (“your active involvement is really great”) and conversational/personable (“you seemed”… compared with “you were noticeably…”).

Despite both being automated, the lenient style feels less ‘systematic’ than the strict. Does this suggest that humans are more likely to be lenient and accommodating, or simply that we associate this type of language less with automation – so it doesn’t feel more ‘human’, just less ‘computer’? This certainly chimes with insights into the Twitter ‘Teacherbot’ from Bayne (2015). The line between human and computer is increasingly blurred through the use of artificial intelligence, and how students react to these interactions is of particular personal interest.

It’s interesting to think about how one responds to each style. Given my engagement appears to be ‘satisfactory’ at a base level, the feedback isn’t necessarily looking to provoke a particular response. However, if my engagement were less than satisfactory, I’m not sure which style would provoke the better response and get me into action. I guess it depends on whether the ‘carrot or the stick’ is the better driver for the student.

The examples above make me consider the Course Signals project in more detail, which was discussed in Clow (2013) and Gasevic et al (2015). From my understanding, this project provides tutors with relevant information about their students’ performance, and the tutor decides on the format of the intervention (should it be appropriate to make one). The LARC project has gone one step further, it seems, in that the style of the response is generated too. Referring to my initial point about choice of style, in the Course Signals approach the tutor would ultimately make this choice based on their understanding of the student. That’s not to say this couldn’t eventually be delivered automatically with some increased intelligence – it would just need some A/B testing early in the student’s interaction with the course to trial different forms of feedback and see what provokes the desired response. Of course, this discovery phase would bring significant risks, as students would be likely to receive erratic and wide-ranging feedback at the point when their engagement with the course is at its most embryonic.
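To speculate on what that A/B testing might look like in practice, the sketch below deterministically assigns each student to one feedback style by hashing a student ID, so a student sees a consistent style across weeks while the cohort splits roughly evenly. The IDs and style names are invented for illustration; this is not how LARC actually works.

```python
import hashlib

# Hypothetical feedback styles, mirroring the LARC report options.
STYLES = ["lenient", "moderate", "strict"]

def feedback_style(student_id: str) -> str:
    """Deterministically assign a student to a feedback style.

    Hashing the ID (rather than choosing randomly each week) means a
    student always receives the same style, which is essential if we
    want to compare how styles affect subsequent engagement.
    """
    digest = hashlib.sha256(student_id.encode("utf-8")).hexdigest()
    return STYLES[int(digest, 16) % len(STYLES)]

# Every call with the same ID returns the same style:
assert feedback_style("s1234567") == feedback_style("s1234567")
```

The follow-up analysis would then compare engagement metrics between the style groups, with all the caveats about context discussed elsewhere in this post.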

As a side note, Clow (2013) discusses the development of semantic and more qualitative data aggregation, which could be put to more meaningful use. Given this, perhaps a logical next step would be to develop algorithms that understand the register and tone of the language used in blog posts, and relay feedback to the student in a similar style (as a way of increasing engagement).

Going back to the LARC project, I thought it’d be useful to look at attendance, particularly in light of Gasevic et al’s (2015) references to the pitfalls in this.

LARC report, w/c 20 Nov 17. Moderate, attendance.

Gasevic uses three “axioms” to discuss the issues in learning analytics. One of these is agency, in that students have discretion over how they study. A natural weakness in analysing attendance, then, is benchmarking, both against the student’s prior history and against the cohort as a whole. This was by design on the part of the UoE team: we were asked to generate LARC reports for a week when activity largely took place outside the VLE, namely on Twitter. As such, there’s an issue here in that the tool does not have the context of the week factored into it, which raises questions about the term ‘attendance’ as a whole. Attendance has been extrapolated from the number of ‘logins’ by the student, and the two may not be as compatible as they appear on first reflection.

When comparing with the wider group, it’s also easy to point out potential holes. One student may prefer to log in once, download all the materials and digest them before interacting on the discussion forums. Another may be more of a ‘lurker’, preferring to interact later in the week, perhaps when other commitments permit.

Ultimately this all comes down to context – situational, pedagogical and peer – and this is where a teacher can add significant value. I think one of the wider challenges for learning analytics is the aggregation of these personal connections and observations; however, this raises challenges of bias and neutrality. It seems that learning analytics can offer significant value as indicators, but the extent to which metrics are seen to represent the ‘truth’ needs constant challenging.


  • Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6) pp.683–695.
  • Gasevic, D, Dawson, S & Siemens, G 2015, ‘Let’s not forget: Learning analytics are about learning’ TechTrends, vol 59, no. 1, pp. 64. DOI: 10.1007/s11528-014-0822
  • Bayne, S. (2015) Teacherbot: interventions in automated teaching. Teaching in Higher Education, 20(4), 455-467.

Reflections on Learning Analytics

Week 11 builds on the previous week’s theme of big data, providing a spotlight on the use of data specifically for learning. Unsurprisingly, there are many links and reference points from topics throughout the rest of IDEL.

The core readings seem to indicate that the field of learning analytics is still very immature, and when compared with the use of other technologies within education, could be considered to be lagging.

It seems, on the whole, that learning analytics operate at a surface level at present. Gasevic et al (2015) highlight the complexities around learner agency and pedagogical approaches that can provoke glaring holes in the interpretation of any data. Ultimately, in any scenario, educational or not, data needs context to have any meaning (and therefore actionable value), and learning analytics seem to be falling short in this area.

I enjoyed reading about Purdue University’s Course Signals project in the core readings. The intention behind this project seems to be to empower the teacher, rather than simply ‘measure’ the student. While the positivity around the results should be taken with a pinch of salt (Clow, 2013; indeed, Gasevic et al (2015) proffer further critique), it would seem that involving the teacher in the choice of interventions recognises the absence of relationship and emotion that these analytics perhaps struggle to encompass. However, it does appear that the aggregation and understanding of quantitative data that could bridge this gap is improving (Gasevic et al, 2015).

I particularly liked Clow’s (2013) description of learning analytics as a “‘jackdaw’ field of enquiry”, in that it uses a fragmented set of tools and methodologies – a rather less cohesive approach than would be required. This is certainly one of Gasevic et al’s (2015) key points: that the field needs to be underpinned by a more robust foundation to really allow it to develop in a useful way.

I wonder if the lack of maturity in this field is a consequence of its nature. The learning analytics cycle described by Clow (2012) identifies that learning analytics are gathered and used after a learning activity or intervention has taken place. As has become even more apparent to me throughout this course, the pace of technological change is significant and rapid, and its impacts on education are far-reaching.

If technology and tools are being developed, trialled, integrated, ditched and recycled so rapidly, it must inevitably be a challenge to assess them with any rigour. Indeed, Gasevic et al (2015) highlight the lack of empirical studies available in this area. It’s interesting to read in Clow (2013) that the use of proprietary systems impedes this too, through the lack of data made available. This is particularly pertinent given their prevalence across the educational sector, which in turn impacts the assessments that can be made across domains and contexts (Gasevic et al, 2015).

A pervading theme across IDEL has been the discourse around the educational ‘voice’ in the development and use of technology, for example Bayne (2015). Quite rightly, the academic world wants to scrutinise, assess and challenge, but it seems the pace of change makes this less and less possible.

For me, the spectre of the private sector is raised in Perrotta & Williamson (2016). It argues that the use of analytics and interventions is “partially involved in the creation of the realities they claim to measure”. The challenge here is the increasing commercial influence in the field of learning analytics. It cites Pearson as an example: Pearson has the large-scale distribution of products to gather learner data, the resources to interpret and mine this, and the size to influence wider policy-making. Given the rhizomatic nature of the development of learning analytics, there seem to be many reasons to be fearful of this development, particularly as it looks to be self-perpetuating.

Of course, I'm keen to keep in mind that this is one side of the argument, and I'm sure the likes of Pearson see themselves as helping to push things forward. Certainly, there are areas where the commercial world can help 'lift the lid' on learner behaviour, and empower teachers to make interventions – I guess the issue is how much those outside the corporation are at the discussion table. The stark truth is that Pearson's core responsibility, above all, is to its shareholders, not its students.

My own personal experience has been at a 'macro' level, or what Clow (2015) refers to as 'Business Intelligence'. As a commercial training provider, we used learning analytics (at a rather shallow level) to understand learner behaviour and product performance. Given the commercial nature of the people around me, however, there was probably an unhealthy bias towards how these could be used to improve commercial metrics. I certainly recognised some of the observations raised by Gasevic et al (2015) around data visualisation, and the pitfalls it can cause.

I think given this week's reading I'll certainly be more aware of some of the challenges in this area, particularly around providing metrics without context. There almost needs to be some 'priming' of the viewer before they gain access, just to reduce the risk of misinterpretation. I think I'll also be keen to trial the use of analytics data to empower tutors, rather than simply automating prompts, which has been the norm in the past. Alongside this, providing students with their own 'performance' data would be something I'd be keen to explore.

Last week's discussions on big data raised concerns about the skills needed within institutions to use big data, and I would suggest these are not limited to the educational world. The same issues occur in the commercial world, and can often have quite dramatic implications if not handled with care and forethought. It seems that if you are a data analyst with an understanding of educational methodologies, you can take your pick of jobs!


Data Visualisation

This week in IDEL, communication on the course ventured out of the forums and onto Twitter. As we were investigating the concept of big data and its role within education, we were challenged to answer some questions on the social media platform, using the hashtag #mscidel as the identifier.

We then used a visualisation tool to summarise the week’s activity, and this is the result:

To add to this data, there were 54 nodes and 151 edges as a result of the week's conversations.
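Out of curiosity, those two numbers alone tell us a little about the shape of the conversation. A quick back-of-envelope sketch in Python (assuming an undirected graph – the visualisation tool may actually build a directed mention network, so treat these as rough figures):

```python
# Rough network stats from the week's Twitter graph: 54 nodes, 151 edges.
# Assumes an undirected graph with no duplicate edges.
nodes, edges = 54, 151

# Average degree: each edge contributes to two nodes' connection counts.
avg_degree = 2 * edges / nodes

# Density: what fraction of all possible pairwise links actually exist?
possible_edges = nodes * (nodes - 1) / 2
density = edges / possible_edges

print(f"average connections per participant: {avg_degree:.2f}")  # ~5.59
print(f"network density: {density:.3f}")                         # ~0.106
```

So on average each participant connected with five or six others, but only about a tenth of all possible links were made – consistent with my impression below of a tight core group rather than a fully interconnected crowd.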

It seems to me there was a handful of vocal participants, indicated by the increased size of their handles. I don't think this is a huge surprise; the names that crop up seem to be those that have been more active on Twitter over the last few months. This does raise a question around the choice of tool: those more comfortable with it – or those for whom it is more ingrained in their daily activities – are perhaps more likely to involve themselves in the conversation. This could be as much effect as cause, though; it could be argued those more active on Twitter are more active for a reason.

Although there are 54 nodes, I'm actually a little surprised it hasn't gone further than this. The visualisation, to me, shows that the conversation was held within a rather tight group. I think it's interesting that a conversation on an open and public platform didn't attract more interjections, particularly given the MSc programme's alumni are likely to have connections with those leading the conversations.

In terms of an experience, it’s certainly been one of the more intriguing activities.

It felt throughout that there was minimal tutor intervention, and this was something that was actively discussed. There was speculation that this could be by design, simply as a way to avoid colouring the tag output and keeping it 'organic'. More simply, it was also argued that the conversation was free-flowing anyway, and the tutors' role was to tee up the conversation and leave it to the students – it's not as if the programme leaders aren't busy!

All the activities so far on the course have been private. The blogs have been locked down, and directed conversations have been held within closed environments, for example, Minecraft, Skype chats, and the forums. Given we're now at week 10, perhaps this move is indicative of our increased confidence in the subject area and a lack of nerves about being 'on show'.

Given Twitter became the central place for conversation, I've inevitably found myself making comparisons with the forums, and the relative merits of each. Twitter has some particular strengths of note as a conversational tool.

  • It was easier to keep abreast of the conversations happening, given Twitter is more readily accessible on our mobile devices and doesn't require us to log in. With the forums I always sat down to 'do some work', whereas I found myself checking and contributing to the conversations more frequently on Twitter. An interesting point here is that there is an increased incentive to contribute, as you ultimately know this is going to feed into the visualisation. It could be argued that this links into the notion of badges looked at in week 9: although a badge is largely used as a reward for activity or completion, it's another influence on learning behaviour and motivation.
  • Twitter caps posts at 280 characters (it would have been interesting to do this exercise with the until-recent 140-character limit). I think this encourages brevity and more salient points. Of course, the trade-off here is the depth of response and the consideration put into it. There's probably a place for both.
  • It seemed to me that the conversations brought in more informal articles and blog postings, whereas the forums seem to be more aligned with journals and papers. Perhaps there's an aspect of formality here, and of how the community is expected to behave in each of the mediums. The forums could be seen as more within the 'virtual walls' of the university (week 6 and 7 flags here!), so there's a perception of what's expected. Outside of these walls, there are fewer concerns. Perhaps we feel less observed and on display to the 'guardians', even if this is completely false.

On the flip side, there are some considerable strengths to the forums.

  • I felt at times on Twitter that it was difficult to ensure I'd seen all the key conversations. Although we were using an agreed hashtag, it's easy to omit it from your posts, and I find Twitter's inbuilt search facility problematic at times. With the forums, it's easy to see all the threads and make sure you've not missed an important one.
  • Despite our increased confidence as the course has progressed, not everyone may be in the same position within the group. Ultimately Twitter is open, and this could be a discouragement to some. Given the granular options for involvement on Twitter (e.g. 'liking' posts), this may provide lurkers with the opportunity to stay involved without contributing. This doesn't concern me personally, but sometimes that push can help you vocalise your thoughts.

I felt the activity also deepened some of my relationships with my peers, and the increased contact was probably well overdue since the Minecraft interactions. For me, Twitter blurs the line between synchronous and asynchronous activity. There were times when some of the conversations were happening in real time. It was also interesting that in this format the timezone factor did come into play. Some of the students are based in North America, so you'd often wake to a flurry of tweets on your timeline – I guess I should count myself lucky that I'm in the 'right' timezone for the course. It'd be interesting to hear the views of those who had a different experience as a result.

Sandra Flynn's early blog post about the visualisation raised some interesting thoughts about the nature of data, and an almost inherent invitation to start comparing or competing. I've struggled to find any research into this area, but it's an interesting notion – once you can start to measure something, does it change the nature of what is being measured?

The visualisation also became a useful social object once it was produced. However, it didn't spark off the conversations I was expecting. This could simply be because it arrived towards the end of the week, but given one of the activities was to blog about this, I suspect many of us (myself included) are saving our observations for our blogs.

It turned out to be useful to have done the bulk of the core reading early in week 10, as it gave the Twitter conversations more meaning. For me, the more pertinent discussion areas during the week revolved around:

Looking forward to exploring this in more detail next week – learning analytics!

Big Data

As IDEL has progressed, we seem to have moved away from some of the more theoretical and human topics towards a more technical focus. Week 9 focused on digital badges and blockchain, and week 10 looks at the concept of 'big data' and its implications for education.

Like blockchain, I'm aware of the concept of big data, but have not got to grips with what it means, or what it can do for us. I think there's a danger some of these initiatives can be seen as simply buzzwords or part of the zeitgeist, without any longevity to them. But it's apparent from digging into both blockchain and big data that these are unlikely to be fleeting developments, and are likely to underpin many technical changes over the next few years.

Unsurprisingly, and in line with the rest of the course so far, the focus on the readings has been to contrast the opportunities big data provides with some of the pitfalls. There’s also been a focus on some of the blind spots in this area.

Starting with the positive aspects, Selwyn (2015) highlights three areas:

  • An increased ability to use data to measure goals, targets, benchmarks, performance indicators, etc.
  • An ability to harmonise and standardise across borders, whether these be institutional or geographic
  • To provide a basis for an infrastructure for education to be understood and organised

But naturally, there are some challenges. Eynon (2013) puts the spotlight on three aspects for concern:

  • The ethics behind the sourcing, mining, interpretation and ultimately use of the data
  • The scope of the data, what can be measured as a result, and the questions it can (and can't) help us answer
  • Inequalities linked to the sourcing and accessing of any data.

The area of ethics is one discussed in detail across several of the recommended readings, as it seems to be a grave cause for concern. A striking example is given by Williamson (2015) on the use of data provided by Facebook, and the criticism afterwards about the permission (or lack thereof) around the use of the data (the defence being that it was already in the 'public realm').

To provide an example related to university admissions: at present, applications are (largely) based on academic results at an undergraduate level. However, 'big data' could provide the ability to forecast degree completion, and perhaps future earning potential, and even link this to social backgrounds and family history. On the one hand, this could be empowering – providing institutes with more insight into how to support their students to succeed. The rather dystopian view is that this could prejudice entry requirements, and given some of the metrics being forced upon universities (e.g. 'satisfaction ratings') and the subsequent 'slap on the wrists' that follows, it doesn't seem impossible that big data will be used in this way at some point.

Selwyn (2015) extrapolates these issues further to explore what they may mean for institutes and their students. He argues that this increase in performance metrics may create an “intensification of managerialism within education”, which suggests a move towards a workplace more typical of a commercial organisation.

This commercial influence is unsurprising, given the history of big data. It seems to me that using the processes and themes of big data in education comes with 'baggage', because of its very design. Because of this commercial background, it's important to critique the strength and role of the educational or academic voice within technical developments such as this. This harks back to one of IDEL's earliest topics, about the role of technology in education, and who is at the table when it comes to discussing and implementing it. This seems to be another example of where education could be perceived as a recipient of technology, rather than helping to shape it.

This is of particular concern when you read about Pearson's developments in Williamson (2015). Some of the criticism of Pearson could be seen as a reaction against their developments, but when Williamson argues that Pearson could be using big data to create new "models of cognitive development and learner progression", this raises a major red flag. Pearson's main responsibility is to its shareholders, not its students, so it's important that the progression of technical initiatives like big data is not left to commercial educational companies to drive.

A common issue picked up by Eynon (2013), Selwyn (2015) and Williamson (2015) is that of an institution's capability, or more specifically the capability of the personnel within it, to use big data. 'Use' in this context is quite a broad term, covering everything from sourcing and mining the data through to combining it with different sources and interpreting it. Williamson argues that there are "several competencies for education data science", and that there is a significant deficit in the numbers of those equipped with the necessary skills. These skills are a blend of the technical (computational and statistical), the educational, and an understanding of the ethical and social concerns in this area. As such, Williamson argues that educational data science is very much a field in its own right, rather than an appendage to statistical analysis. Naturally, if this is an area that is significantly under-resourced, then it reduces the impact education can have in shaping big data.

This may also be more difficult to fix than Eynon envisages. The demand for talent – given the nature of the role – is spread across both commercial and educational organisations, meaning commercial companies may be able to outbid educational institutes for their services. It may be one thing to recognise the issue, but fixing it may be increasingly difficult.

I picked up on several themes across the papers that have been discussed earlier in IDEL.

Given the rise of commercial influence in this area (in particular), there seemed to be a ‘call to action’ to the wider educational crowd to become more vocal, and come more centre-stage in these discussions. Selwyn (2013) argues that “the opportunity now exists for educational research to develop nuanced approaches to understanding, and then offering alternatives to, the dominant data conditions that are being established across educational contexts”. This reminded me of Biesta (2013), in his call for teachers to teach, and Bayne (2015) to ensure academia has a role to play in wider technological developments.

Biesta’s references to a neo-liberalistic agenda also pop up in Selwyn – “expanded access to data allows institutions and individuals to operate more efficiently, effectively and equitably”, and Eynon also references themes of efficiency and cost-effectiveness in big data.

Selwyn (2013) also uses the metaphor of water in his discussions around big data. 'Deluge', 'flow', and 'flood' are terms used, and I think this is possibly inevitable. The comparisons between data and water are natural – it can be plentiful, can travel at speed (rivers) or not (lakes), comes from many different sources and directions, and requires real skill to manage. It's also fundamental to life, and you could argue data is the bedrock of economies now (it's even been termed more valuable than oil). The dystopian view is that it can also be a dangerous force of huge power, and like recent devastating floods all over the planet, can pose an immediate danger to us through years of mismanagement.

I thought it was interesting that Selwyn (2015) points out that the sociological approach to data is to assume that there are already some inherent issues with it. This admission of a lack of neutrality is quite refreshing, and makes a lot of sense. It's a battle that's difficult to fight – it's probably a better use of time to acknowledge this and work out how to deal with it than to try to fix it at source. The rhizomatic metaphor is also apparent here, in that Selwyn argues that "this approach is careful to acknowledge that data are profoundly shaping of, as well as shaped by, social interests".

As a final thought, I liked this quote from Eynon (2013) – “We must not get seduced by Big Data”. I think if you were to replace ‘Big Data’ with ‘technology’, you’ve probably got the core theme of IDEL in a nutshell.


Blockchain and Digital Badges

Building on the paper by Halavais (2012), I’ve been reading around the subjects of blockchain and badges, as prompted by this week’s activities.

It's taken some wrestling to get my head around the concept of blockchain. Although I'm well aware of Bitcoin, the underpinning technology has been explained to me in the past and made little sense. But having read the recommended readings such as this, and this, I feel I'm getting to grips with it. As far as I can tell, it provides a permanent, incremental record of data, which as a result helps establish a sense of 'trust'. Naturally, then, it's been linked to the concept of badges within education, as this could have an obvious benefit in establishing and ratifying any credential, removing any ambiguity about authenticity.
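To check I'd actually understood the mechanism, I sketched a toy version in Python. This is emphatically not how Bitcoin works in full (no proof-of-work, no distributed network, and the records are invented for illustration) – just the core idea that each block's hash commits to everything before it, so tampering with any earlier record is detectable:

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    # Each block's hash commits to its own data AND the previous block's hash.
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

# Build a tiny chain of (hypothetical) credential records.
chain = []
prev = "0" * 64  # genesis placeholder
for record in ["Alice: MSc awarded", "Bob: badge issued"]:
    prev = block_hash(prev, record)
    chain.append((record, prev))

def verify(chain):
    # Recompute every hash from the start; an altered record
    # breaks its own hash and every hash after it.
    prev = "0" * 64
    for record, h in chain:
        prev = block_hash(prev, record)
        if prev != h:
            return False
    return True

print(verify(chain))  # True
chain[0] = ("Alice: PhD awarded", chain[0][1])  # tamper with a record
print(verify(chain))  # False
```

Even this toy version shows why the technology appeals to credentialing: once a record is in the chain, it can't be quietly rewritten without the change being evident to anyone who checks.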

However, having been exposed to the concept of open badges previously (as mentioned here), I still have some fundamental questions around this that I cannot seem to resolve. The key one being: if anyone can create and issue a badge, how can a sense of 'value' be attributed to it? Halavais (2012) argues that for badges "to carry social currency they must represent a significant sacrifice". As such, how do we define a common understanding of 'sacrifice' and attribute it in a consistent way? It could be a unit of time, for example, 30 hours of 'learning' – but what does an hour mean in this context? What does 'learning' even mean here? I think without a common measure, it becomes impossible to compare. What's better, a table or cheese?

To gauge a sense of sacrifice, you may have to revert to type, and align it to traditional methods of verification. For example, upon viewing a badge you may make a judgement on the issuing authority (e.g. a university) and the level of qualification (e.g. Masters). If this is the case, what's the purpose of an 'open' badge?

A badge is intended (initially) as a quick visual reference that is symbolic of something – if this is not easily understood at a glance, it fails to be a badge; it's just an image. A joke isn't funny if you have to explain it – it just becomes a statement. A joke is also only found funny by an audience receptive to it, and aware of the context that forms the joke. If an open badge can literally be from any sphere, the context may only be apparent through luck, not by design.

I also think that when many things are 'open', this often results in an overwhelming increase in instances of the thing that is open. This can make it harder to wade through the ocean of information. YouTube, for example, has provided an open platform for anyone to display their creations – say, music. But while a good musician can come from anywhere, it does not mean anyone can be a good musician, and the wealth of music there can simply make an unpolished diamond ever harder to unearth.

It's interesting to explore this in terms of my own professional history. I spent many years working within the TEFL training industry (Teaching English as a Foreign Language). This is notorious for being a tad 'wild west' – there's no overarching accrediting body for TEFL, meaning that in essence anyone can set up a TEFL training school and start issuing certificates. This is driven by the fact that many schools overseas are simply happy to hire based on native English-speaking ability, rather than any teaching competency. This is changing over time as the market matures and there is a realisation that being a native speaker does not make a good teacher, but it is still very much the case in many areas around the world. I bring this up because there are definite links here, in that there's no authority within this sphere. It's almost as if anyone can issue a badge – an open badge, you could say. The same questions persist – you can validate that someone has achieved a badge, but what does the badge mean? Sure, there's metadata that can give you further insight, but does this metadata provide any real qualitative information?

Back to blockchain, and more specifically the development of Blockcerts. Blockcerts are blockchain-based credentials. Belshaw makes an interesting comment that, in his view “blockchain-based credentials are good… in high stakes situations”, and that he’d “be happy for my doctoral certificate, for example, to be on the blockchain.”

Aligning this to the social classes discussed by Halavais (2012), we can assign the guardian, or governance, class to the Blockcerts initiative, and the commercial class to the 'weird and wonderful', as he puts it.

I think the irony here is that openness and blockchain may actually strengthen the governing approach, as it is the traditional authorities that issue the credentials, and the system still relies on them to provide any weight.

The only way around this (in my opinion) is to focus on the end goal. It's not about the credential; it's about how someone performs in the classroom – ultimately, how competent they are in the profession. Obviously, this is also open to its own judgements, biases, and hidden agendas.

Widen this further, and it links to the Edublocks concept outlined in Audrey Watters' guide to blockchain in education. In essence, Edublocks link employment (and, more tangibly, pay) to educational history. This seems to be the natural progression, but even for me, as someone who is probably overly pragmatic about things, it rings some loud alarm bells. The key one: by tying everything back to employability, what happens to the simple pleasure of wanting to learn about something, purely for the curiosity and intrigue of knowing more? Perhaps there are links here to the neo-liberalistic influence (Biesta, 2012) permeating education; this certainly seems to be another example of it.


Reflections on badges

I think we’ve all got our own personal history with badges. Somewhere tucked up in the loft is a scouts sweater stitched with a heap of merit badges, and probably in the same box is a Blue Peter badge for a competition entry.

Online, it's difficult not to notice the Brexit situation creating a plethora of both pro- and anti-EU imagery associated with Twitter accounts. A more subtle example may be the triple brackets also appearing around Twitter handles. Although these were initially adopted by users looking to subvert a disagreeable Google plug-in, they've now become as much a sign of solidarity and defiance as anything tangible.

I've also encountered badges in my professional life. Several of our clients at Candle are interested in providing students with access to digital badges upon completion, as a way of adding additional value to their training products and helping to create unwitting ambassadors for their brands. A few years ago I was also lucky enough to present at an elearning event where Doug Belshaw, who led the Mozilla Open Badges project, was also speaking.

Alexander M.C. Halavais’ paper on the ‘genealogy of badges’ provides a detailed insight into the history and nature of badges, which should aid the direction of travel in this area. In my experience it’s still quite a woolly topic ‘on the ground’. Many people in my network would have been exposed to the idea, but from my experience I don’t think there’s a consensus on why they should be used, and what they should be used for.

My key takeaways (there are a few…):

  • The concept of badges goes way back, with many early examples having military and religious roots
  • Badges can be used for a variety of purposes. They can:
    • show support for a cause (a ‘button’)
    • be used as a way of shaping a persona, either intentionally or unintentionally
    • display membership of a group (both for positive and negative reasons, e.g. the Star of David)
    • indicate status within a group e.g. moderator, or new member
    • display achievements (and provide a route to more detail on this)
    • display granularity or progression of expertise
    • be used as a motivational tool and incentivise progression
  • Badges can often be a combination of many of these factors. This mix can have positive and negative consequences
  • There are different types of badges
    • Badges of honor, authority and privilege
    • Badges of achievement, qualification, and experience
    • Badges of experience and expression (the most common sort of badge found on the web)
    • Badges of survival
    • Boundary badges and monstrous badges
    • Learning badges
  • Badges can be acquired through different means: a demonstrated skill; experience; upon recognition by peers; awarded by a regulatory body; upon achievement of milestones (e.g. number of posts)
  • Badges could be used as a form of compensation for sacrifice, in lieu of other forms of compensation e.g. monetary
  • Badges have primary value in the community where they are earned and displayed. Transferable value is minimal, although attempts are being made to bridge this gap (e.g. Mozilla).
  • Badges are more than just ‘skin deep’ in terms of the symbolism and the dynamics of symbolism behind them. In a sense badges could be considered to have a constructivist make-up. (There are some echoes here of the ‘Medium is the message’ video referenced earlier in the course.)
  • In an online context, badges are prevalent in communities and environments where a flat hierarchy is pursued, yet they echo less flat structures. There seems to be a dichotomy at play here, and there are obvious links to the critique of OER in Bayne, Ross, Knox (2015).
  • Trust in what the badge is conveying is very much at the core of its success as a symbol.
  • The perceived value of a badge is tied to the effort and sacrifice used to attain it.
  • Badges with perceived high value are understandably more likely to be ‘faked’. A clear example of this is knock-off fashion items – it’s not the materials that are important per se, it’s the logo/badge that appears on them.

Obviously this is not a small summary, but I think this reflects the real depth of thought the author has provided. The term 'badge' is obviously quite broad, in that badges can be applied in many ways, in many contexts, for many different purposes.

My own reflection is that the paper primarily focusses on badges earned through activity. This may be as a result of the author’s interest, or a reflection of the situation at the time of writing (2011), and I’d say it’s probably the latter given the quote “but so far the mechanisms for verifying such badges do not exist.”

The use of badges has strong links to those of identity and community, topics discussed early on in IDEL. Badges are very much a manifestation of aspects of these, and influence the nature of a community and the social interactions within it.

Professionally, we've started to see increasing demand for the use of badges in digital credentialling. It seems there are plenty of vendors developing solutions for this market, such as Credly and Accredible. I think the rise in this area taps into an increasing demand to build personal brand (and trust and proof in it), and is a viable solution because:

  • Metadata allows increased visibility and depth into the achievement. It can also keep it 'valid', in that through some vendors the awarding body has the power to invalidate credentials
  • For the awarding body, it allows them to take advantage of the inherent viral nature of the web, and put their brand in front of the peers of students (e.g. LinkedIn connections).

I think for these badges to work, there needs to be a common understanding that this metadata exists, how to access it, what it represents, and wider digital literacy around what makes a digital badge 'reputable'.
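As an aside, the metadata can even protect the learner's privacy while still allowing verification. My (possibly simplified) reading of the Open Badges assertion format is that the recipient's email can be stored as a salted hash rather than in plain text; the structure and field values below are illustrative of that idea rather than a definitive rendering of the spec:

```python
import hashlib

def hashed_identity(email: str, salt: str) -> str:
    # Salted SHA-256 of the recipient's email, prefixed with the
    # algorithm name in the style used by Open Badges assertions.
    digest = hashlib.sha256((email + salt).encode()).hexdigest()
    return "sha256$" + digest

# A simplified, hypothetical badge assertion.
assertion = {
    "recipient": {
        "type": "email",
        "hashed": True,
        "salt": "deadsea",
        "identity": hashed_identity("learner@example.org", "deadsea"),
    },
    "badge": "https://issuer.example.org/badges/1",  # hypothetical BadgeClass URL
    "issuedOn": "2017-11-20",
}

# A verifier who is given the claimed email and the salt can confirm
# the match without the assertion ever exposing the address itself.
claim = "learner@example.org"
print(assertion["recipient"]["identity"] == hashed_identity(claim, "deadsea"))  # True
```

This is a neat design choice: the badge can be displayed publicly (on LinkedIn, say) while the personal identifier behind it stays hidden unless the holder chooses to prove the link.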

I've also seen badges being used as a way of incentivising progression, as the paper highlights. I've seen significant levels of cynicism around this – I think because it's seen as treating students as dumb, and easy to motivate or manipulate. Perhaps because it's a concept that has come over primarily from gaming, it's seen as applying something from a non-serious situation to a serious one, particularly when one's Facebook stream seems rampant with easy-to-attain Candy Crush badges. The paper argues that the social currency of a badge is linked to sacrifice, which ties in with this.

In the author's reference to the origins of the 'badge', and the uses of these within the military, badges were used as a way of fostering community and establishing commonalities. But alongside this, they were also used as a means of 'command and control': they provided an easier way to identify a group of people within battle, and manage them accordingly. Bringing this to the current day, are badges (in their more abstract form) being used in this way now, and are they being exploited? Technology allows aggregation on a mass scale, so there could be ways to scrape the web and corral groups of badge wearers accordingly. By displaying their 'values' through an icon, are badge wearers unknowingly categorising themselves and allowing themselves to be targeted as part of a group? Given the current prominence of Russian bots in the news, I can't help but wonder if this self-labelling by web users is being used as a way of both targeting and infiltrating these groups.

One area I haven't touched on in this post is the rather intriguing notion of guardian and commercial classes. This seems to have wider repercussions that I'd like to dwell on a little. There are links here (again) to Biesta's view on the learnification of education, and the roles teachers play within this. These societal and political influences really penetrate all aspects of education, so it shouldn't be a surprise to me that they affect the tangible aspects, such as badges – but it does. I need to think this through and follow up with a more considered post…


  • Halavais, A. M. C. (2012). A genealogy of badges. Information, Communication & Society, 15(3), 354-373. DOI: 10.1080/1369118X.2011.641992
  • Bayne, S., Knox, J., & Ross, J. (2015). Open education: the need for a critical approach. Learning, Media and Technology, Special Issue: Critical Approaches to Open Education, 40(3), 247-250.
  • Biesta, G. (2012). Giving teaching back to education: responding to the disappearance of the teacher. Phenomenology & Practice, 6(2), 35-49.

MOOC Reflections

As prompted by the IDEL course, I decided to tackle the MOOC titled 'Religion and Conflict' on the FutureLearn platform. MOOCs aren't a particularly new experience for me; like most participants, I've started (and not completed) several.

I initially contemplated starting a MOOC related to the current IDEL studies (e.g. there's a course currently on offer from the University of Leeds on Blended Learning), but felt a change of subject might do more good, and perhaps allow me to detach the format from the subject with greater ease.

So how open is a ‘Massive Open Online Course’? Well, as Bayne, Knox and Ross (2015) argue, in some respects it is very open, and in others less so. They argue that the premise of open in digital education largely rests on three factors – geographical, hierarchical and financial.

The MOOC I’ve taken certainly ticks the box in terms of geography. The course is provided by the University of Groningen, with whom I would never normally get to study. But it’s important to keep a critical eye on this, as there is little sense that I am studying with the University, or any sense of Groningen. In weeks 6 and 7 of IDEL we looked at what it meant to study at the University of Edinburgh, and on reflection, using the UoE VLE as the backbone for the course was actually part of the identity here. With the MOOC, the course is ‘provided’ by the University of Groningen, but I don’t feel like it is ‘delivered’ by them. The format is a fairly regimented experience, rolled out across many institutions (understandably this has to have commonalities to make it ‘FutureLearn’). So how open is this? From my perspective, it’s given me access to expertise from a university, but I’m not accessing the experience of the university.

Regarding institutional structures, this is certainly more ‘horizontal’ than a traditional university experience. But at what cost? By delivering to the masses, and at scale, the curricula and content could become more ‘vanilla’.

From a financial perspective, the MOOC is free to participate in, but in my opinion this is as much a curse as a blessing. There’s no ‘skin in the game’: it’s as easy to leave the course as it is to join, and this lack of commitment does affect the overall course experience and ultimately the value gained. IDEL works particularly well because of the social interaction between the students, and the role the tutors play within this dialogue. As much as the conversations are instigated by the tutors, the cost and time commitment required on IDEL do push the student to fully take part. Unlike with a MOOC, you are also aware that the MSc is a long-term commitment, so the reward for building these connections is felt over many years (and beyond).

Given the removal of these barriers, how has this impacted the take-up of online educational opportunities? Not much by all accounts. Sandra Flynn posted a link to an insightful study (Emanuel, E.J., 2013) on the forums, demonstrating that those taking the opportunities presented by MOOCs are largely those already degree-qualified. It would suggest that there could be awareness, desirability, and curriculum issues around open educational opportunities at play. If the ambition of MOOCs is to increase educational opportunities for the masses, then perhaps it is failing in this respect.

On the flipside, many of the inadequacies highlighted by Bayne, Knox and Ross (2015) manifest themselves in my experiences of the MOOC so far.

Self-direction and motivation are a key oversight picked out (and to reaffirm a previous blog post, I love the phrase “imagined autonomous subject”), and this is plain to see. Already the frequency of student-generated content (largely comments) has dropped significantly since week 1. As much as this is an issue of commitment, and perhaps value, it’s also (in my opinion) down to the very nature of the MOOC. Community aspects are a core activity of the MOOC (I’m sure the course designers are aware that this is a central part of a successful learning experience), but could also be viewed as quite peripheral. I think the latter is ultimately down to the open access: it’s simply very difficult to forge deep interactions of real value with such large volumes of participants. Ironically, this scale puts people off continuing the course, and then the volume of participation drops to a level where the community could actually be of real value! I’m sure there are many factors in this lack of motivation after the initial sign-up, and one could be as simple as ‘life getting in the way’.

Another concern raised in the critique is the role of the teacher within the MOOC. My experience so far is that the role of the teacher is largely that of content provider, as there is little or no interaction in the communication channels. There also doesn’t seem to be an obvious route to contact them. This is understandable, given the sheer scale of enquiries. The very openness of the course seems to be having a detrimental impact on the course itself.

Finally, there is a sense of isolation amongst students. This ties into the community feel, and is driven by the quite limited opportunities to connect with each other. On this course, the other participants are simply a list of names. There are no opportunities to connect outside of the FutureLearn platform or the prescribed activities.

There are early signs that students are simply there to ‘complete’ rather than interact. For example, it’s useful to note the lack of replies in the introductions. This could be for several reasons, but a suggested factor could be the sheer volume (there are currently 12 pages of introductions!). As a student myself, there is little incentive to reply if there’s no chance of fostering relationships or keeping track of people you’ve noted. Harking back to an earlier part of IDEL, it’s interesting to note that no rich media is used by students; it’s primarily the written word. As we know, the written word does not have the ‘thick’ descriptions (REF) that an image or video can provide. It would be natural to draw comparisons with IDEL, where our introductions were videos and images made using external tools. This visualisation ‘embodied’ the student and humanised our peers, which I think plays a factor in driving relationships.

I thought it was interesting to observe that as a student it is possible to comment on specific elements, rather than discuss as a group. The videos and entries on specific pages could perhaps be considered the social objects, rather than the broader themes. This doesn’t particularly facilitate an ‘open conversation’.

Image: Many discussion posts have no replies.

Another aspect to consider in ‘openness’ is the overall course journey. In the MOOC, some activities are locked; it’s very much a fixed pathway. Course outcomes are fixed and defined – it’s very prescriptive in terms of what you can gain. In comparison, while IDEL has a strong sense of direction, there is the opportunity to meander and align with your own personal goals. There is also a sense of flexibility – for example, the additional Minecraft session that was discussed. It’s perhaps not fair to compare the two courses, but I think it’s useful to highlight the contrasts.

In summary, I think it can be argued that MOOCs remove some of the barriers in place, as highlighted by Bayne, Knox and Ross (2015). However, the very openness of the format impacts the course delivery, which in turn throttles the value that can be gained.

Access is open, but once part of it, the experience can be quite closed. Compare this with IDEL, where access is significantly more restricted, but once part of it, the experience is very open.

Overall, I do think it’s important to acknowledge that MOOCs can be a good thing. They open up many doors and provide access to knowledge and expertise that could otherwise be seen as less accessible. I think if they were repositioned, however, as ‘tasters’ and a way of exploring interest in new topics, this may help generate more interest. Perhaps it would also drive changes to a format that does not seem to motivate students.

In terms of taking these reflections into my professional life, I think it only reaffirms some of the thoughts I’ve had throughout this course. Mainly that a synchronous course, with a ‘social presence’ at the heart of it, and key involvement from a course leader leads to a much more fruitful experience for all involved. I think the MOOC observations add weight to the understanding that a commitment is key (this doesn’t necessarily have to be financial), and that less is perhaps more in terms of cohort size.


  • Bayne, S., Knox, J., Ross, J. (2015). Open Education: the need for a critical approach. Learning, Media and Technology Special Issue: Critical Approaches to Open Education. 40(3). pp. 247-250.
  • Emanuel, E.J. (2013). Online education: MOOCs taken by educated few. Nature. 503(7476), p. 342.

Minds on Fire!

The IDEL course so far has been excellent at instigating a critical view on things. Given this is the first module of a potential Masters, I think the team has crafted an experience that’s insightful and thought-provoking, whilst also creating a firm foundation for the rest of the course. Understandably then, several of the recommended readings so far have focused on critiques of popular thinking within digital education.

So I’ll admit it was quite refreshing to read Brown and Adler (2008). This paper focuses on the positive aspects and opportunities of open education. I think that while it’s important not to get lost in the hyperbole, there are many upsides to open education and the increased opportunities technology brings. I see this as a ‘nice problem to have’, and we are lucky to be involved in these discussions at such a critical time in their development.

The paper is showing its age in parts. However, hindsight is a wonderful thing, and at the time of writing I’m sure it was difficult to foresee how trends would develop.

One example is the talk of moving from a single career to one of multiple careers. In the growing gig economy, there are suggestions that it will actually be one of multiple, concurrent careers. I’d certainly describe myself as falling into this category. The point of this in terms of education is that as workers in this economy we’ll need to “acquire new knowledge and skills on an almost continuous basis”, and if anything this could be on a greater scale than imagined by the authors.

I think this illustration from Heather McGowan sums this up rather well visually:

I was also struck by the authors’ views on open education, and how these seem to fall victim to some of the inadequacies described by Bayne, Knox and Ross (2015). For example, the paper seems to suggest that access to resources has been the primary issue, and that unlocking this will help solve the problem of increasing educational needs. It also makes the assumptions about self-directed learners that Bayne, Knox and Ross (2015) are concerned about – that they have the skills, motivation and self-guidance to conduct this upskilling largely by themselves.

I found it interesting how social learning was discussed in the paper. I think it makes sense as a topic at a time when social technologies were becoming more commonplace and usable, yet I find it a little odd that this has to be explained, particularly when the paper is aimed at an academic audience. It perhaps goes to show how technology had yet to develop to facilitate many of the activities that were taking place in traditional classroom-based learning. Perhaps this is also indicative of a lack of involvement at this stage in educational technology development from those in education?

‘Learning to be’ is not something I had previously seen phrased in this way, and it’s an interesting choice of words. I like how it links to connecting with the existing community in an area, rather than just acquiring the skills and expertise associated with it. It acknowledges the importance of social connections within the industry of choice, and that having these in place forms a critical part of the development process. I’ve heard this referred to recently as the ‘personal learning network’. You can start to see concepts like Charles Jennings’ 70:20:10 rule being born from papers such as this.

This aspect of community made me reflect on my own involvement in new circles on the back of IDEL so far. The course certainly feels very far away from the more traditional teacher/student model. To me, it feels like there are circles around the MScDE team, and we’re being brought into those circles gradually – in a way that avoids overwhelming us, but also maintains the integrity of the circles, which are only as valuable as the contributions made within them. With fellow students, I feel I’ve had touchpoints into their worlds, but I wouldn’t say there is an IDEL students’ ‘circle’ (for want of a better word); I think it’s more a case of becoming part of the wider existing one.

Bayne, Knox and Ross (2015) talk about hierarchies being one of the three pillars that ‘open’ attempts to knock down. Given this, it’s pertinent to note the talk of “trusted individuals” and “administrators” within environments in the paper, for example, Wikipedia. It could be argued that one hierarchy has simply been replaced by another, although the perception of this within these circles may be different. It raises the question: what does open actually mean? In terms of hierarchies, is this some sense of democratic choice and selection? From the outside looking in, I’m not sure I’d ascribe ‘democratic’ to Wikipedia’s choice of administration. In this sense, does open simply mean ‘crowd-sourced’?

This paper has lots of links to the workplace training that I’m currently involved in. It may be that the paper comes at things from similar angles; however, it could also be that thinking in this area is still very much stuck in 2008! I particularly liked the quote:

“We now need a new approach to learning—one characterized by a demand-pull rather than the traditional supply-push mode of building up an inventory of knowledge in students’ heads”.

I heard this neatly summarised recently as “Resources, not courses.” The commercial challenge for my business lies within company cultures, as it often comes down to budget choices. Specifically, there are often separate budgets for ‘training’, whereas a bank of organisation-relevant resources that staff could access when required would be difficult to gain funding for. A way of thinking we’re trying to change, slowly!