Findings from a Case Study on Refugees Using MOOCs to (Re)enter Higher Education
Lancaster University, University of Birmingham & Art of E-learning (United Kingdom)
Abstract
This paper presents a case study evaluating the online learning experience of ten refugees on MOOCs. Qualitative data were collected from the learners, generating a set of 43 statements depicting the learners’ experience of learning, which were analysed using an augmented Community of Inquiry (CoI) theoretical framework. The key findings show that learners particularly desired teaching presence in terms of facilitation and feedback on their progress; they viewed online social presence as being important but generally not well managed in MOOCs; and they expressed cognitive presence mainly in terms of the selection and use of information sources. Learning presence (the additional element of the “augmented” CoI framework) was described primarily in terms of the importance of goal-setting and planning. The implications for organisations supporting refugees and other learners in disadvantaged circumstances on MOOCs are discussed.
Key Words: MOOCs, refugees, Community of Inquiry, developmental evaluation, Kiron
Reception date: 28 July 2018 • Acceptance date: 14 October 2018
Introduction
This paper examines the learning experience of refugees and asylum seekers (who are included in the term "refugees" in this paper for brevity) within the context of a German non-governmental organisation, Kiron Open Higher Education gGmbH, which supports refugees to learn from massive open online courses (MOOCs). Kiron has designed learning pathways for refugees, with MOOC curriculum outcomes mapped onto typical learning outcomes of German higher education institution programmes. The aim is for those refugees who obtain credits from MOOCs to have those credits recognised by higher education institutions as being equivalent to their first semester of study, allowing Kiron learners to go straight into the second year of their degree programmes, assuming they also meet other entry requirements specified by the respective institutions (Suter & Rampelt, 2017). In addition to an online learning platform through which learners are directed to MOOCs in their subject of choice, Kiron offers support through the provision of volunteer study buddies and mentors, online tutorials, online study groups, and occasional face-to-face "study weekends". The aim of this study was to help Kiron, and other organisations that support refugees and other MOOC learners in disadvantaged circumstances, to develop systems and strategies for effective learner support.
Context of the Study
Kiron is a social change organisation that acts as an intermediary between refugees and higher education institutions in Germany. The ecosystem within which Kiron operates is complex in at least three ways. Firstly, the MOOCs it recommends are drawn from institutions around the globe, and these institutions are themselves bound by contracts with platforms such as Coursera and edX. Secondly, the availability of local volunteer tutors, buddies and mentors that Kiron can draw on to support learners varies considerably according to region and discipline. Thirdly, the German universities that will ultimately accept Kiron learners operate within different regional and institutional policies. For these reasons, a developmental evaluation (Scriven, 1996; Saunders, Charlier & Bonamy, 2005) was chosen, as opposed to a formative or summative evaluation focusing on measuring outcomes or impact. One of the key characteristics of developmental evaluation is "double-loop learning (learning how to learn about the nature of the problem and situation)" (Patton, 2015, p. 302). In this study, the "object" of the evaluation comprised the learners' experience of those elements provided by Kiron, such as carefully curated and sequenced MOOCs, volunteers, and recommended free digital resources (e.g. language learning apps), as well as other MOOCs, resources (such as YouTube videos) and supporters external to Kiron.
Literature Review
Refugees and MOOCs
While some commentators have noted that MOOCs are generally most suited to relatively privileged learners living in well-resourced environments (Carlsen, Holmberg, Neghina, & Owusu-Boampong, 2016; Nti, 2015; Rambe & Moeti, 2017; van de Oudeweetering & Agirdag, 2018), the potential for MOOCs as a learning tool for refugees has been noted (Aydin, 2017; Bossu & Stagg, 2018). There is an emerging body of literature on migrants and open, online higher education. For example, the MOOCS4inclusion project report indicates that there is "a plethora of new FDL (free digital learning) initiatives for migrants and refugees that vary in nature, design and purpose" (Colucci et al., 2017, p. 99), and notes that the cases where MOOCs were found to be most effective tended to be "targeted, blended and facilitated" (Colucci et al., 2017, p. 101). Moser-Mercer (2014), in her description of how she supported two refugees using a MOOC in the Dadaab refugee camp in Kenya, noted, not surprisingly, that there were significant technological obstacles for the learners. She also found that her role as a remote mentor to the learners was crucial: she had registered on the course as a learner herself, and attempted to anticipate challenges of a cultural, linguistic or technological nature that might arise for the learners so that she could intervene and support them in good time. She communicated regularly with the learners via email throughout the MOOC, and the learners noted her regular "presence" as an essential element in their motivation to complete the course (Moser-Mercer, 2014). Crea's (2015) report on a four-year higher education pilot in refugee camps in Africa, which included the use of MOOCs, emphasises the need for cultural and linguistic translation of resources for learners in (and from) developing countries. The same point is confirmed elsewhere in the literature (e.g. Nkuyubwatsi, 2014; Moser-Mercer, Hayba & Goldsmith, 2016; Bozkurt, Yazici & Aydin, 2018).
Community of Inquiry Framework
The Community of Inquiry (CoI) framework for online learning was developed by Garrison, Anderson and Archer (1999). It comprises three interdependent dimensions in a process model for learning and teaching in a community: teaching presence, social presence and cognitive presence. These dimensions reflect the distributed teaching and learning responsibilities of all participants, with no strict role boundary between learners and teachers. A quantitative CoI survey instrument was published in 2008, using the three overarching presences and subcategories derived from the authors' earlier publications: teaching presence was divided into design and organisation, facilitation, and direct instruction; social presence comprised interpersonal relationships, open communication and group cohesion; and cognitive presence was divided into four phases, namely triggering event, exploration, integration and resolution (Arbaugh et al., 2008; Garrison, 2017). These categories were further described in terms of 34 indicators. The publication of this instrument led to much wider use of the CoI framework in the literature.
In a study which examined publications from 2009 to 2013 in seven leading online and distance learning journals, the CoI framework was found to be the most frequently used theoretical perspective (Bozkurt et al., 2015). In other studies, the framework has been shown to predict students' perceived learning and their satisfaction in online learning (Akyol & Garrison, 2008), and to predict learning outcomes (Garrison, Cleveland-Innes & Fung, 2010). It has also been used to compare students at different types of higher education institution (Moreira, Ferreira & Almeida, 2013), and as a heuristic for learning design (e.g. Dolan, Kain, Reilly & Bansal, 2017; Amemado & Manca, 2017). While the framework was originally designed for use in the context of asynchronous online learning, it has also been found useful for analysis of synchronous video communication in education (Themelis, 2014). The CoI categories have been found to align closely with recommendations for online teaching in professional education (Dunlap & Lowenthal, 2018), and the framework has been proposed as a model for conceptualising professional training for people in developing countries (Murugesan, Nobes & Wild, 2017). Two recent, large-scale studies have focused on the use of the CoI framework in MOOCs: Cohen and Holstein (2018) showed that MOOC learners attributed the success of certain MOOCs to a combination of all three presences, while Kovanović et al. (2018) confirmed the validity and reliability of the CoI survey instrument for measuring perceived levels of teaching, social and cognitive presence within MOOCs, but suggested adjustments to the subcategories within the three presences to better account for specific learner perceptions arising out of the differences between MOOCs and formal distance programmes, particularly the large size of learner cohorts and the relatively short duration of courses.
Critiques of the CoI framework have primarily pointed to its lack of explanatory power with regard to learners' self-regulation (Broadbent & Poon, 2015; Kuo, Walker, Schroder & Belland, 2014; Shea et al., 2010; Shen, Cho, Tsai & Marra, 2013). Cho, Kim and Choi (2017) found that highly self-regulated learners were more likely to perceive higher teaching, cognitive and social presences than learners with low levels of self-regulation. Shea et al. (2010) and Shea et al. (2012) proposed a fourth presence, "learning presence", to account for self-regulation, drawing on work by Bandura (1986) and Zimmermann (1999). Shea et al. (2012) identified the following subdivisions for learning presence: forethought and planning, monitoring, and strategy use. In keeping with Arbaugh et al.'s (2008) CoI measurement instrument, each subdivision had three to six descriptive indicators. In this paper I refer to the combination of the original three presences and learning presence, with their respective indicators, as the "augmented CoI framework" (see Figure 1). There is emerging research validating this concept (e.g. Pool, Reitsma & van den Berg, 2017). In response, Garrison (2017, p. 31) has warned that a fourth category would complicate the framework, compromising its explanatory power unnecessarily. Instead, he suggests using a "shared metacognition construct" as a way of addressing the identified "gaps" in the CoI framework (Garrison & Akyol, 2015); however, this construct is not yet well developed. A different critique is offered by Jaffer, Govender and Brown (2017), who found in their study of "wrapped MOOCs" in South Africa that questions of structure and agency (Giddens, 1986) could not be accounted for within the CoI theoretical framework. This issue is likely to be of particular relevance in the case of a marginalised group such as refugees.
Figure 1: Augmented Community of Inquiry Framework
The Study
Ethical Considerations
As this paper was written as part of a PhD programme, I gained the requisite permission to conduct this research from Lancaster University. I obtained voluntary, informed consent from the participants, using a consent form approved by Kiron staff. For data protection purposes, the research participants were pseudonymised. Following Clark-Kazak (2017, p. 13), I avoided asking research participants for details about their forced migration experience that might have been re-traumatising. Participants were invited to check the transcripts, an approach that emphasised their role as research partners rather than "subjects". The design of the study as a developmental evaluation was aimed at bringing about reciprocal benefits for the communities of participating refugees (Mackenzie, McDowell & Pittaway, 2007). As a further ethical consideration, I am sharing the research process and findings openly, in order to increase opportunities for peer feedback and to improve the visibility of findings (Pitt, de los Arcos, Farrow & Weller, 2016, p. 36). To this end, I have published much of the raw data at a website created for this purpose (https://sites.google.com/artofelearning.org/qoolref).
Research Questions
The study was guided by four research questions (RQs):
- RQ1: What are the learners' depictions of how they learn online?
- RQ2: How do these depictions map onto the indicators for teaching presence, social presence and cognitive presence in the CoI framework (Garrison, 2017, pp. 173-175) and Shea et al.’s (2014) proposed indicators for learning presence?
- RQ3: What can we learn about the application of the augmented CoI framework to the evaluation of the learning experience of refugees, and potentially also other learners in disadvantaged circumstances?
- RQ4: What are the implications of the findings for organisations supporting refugees, and potentially other learners in disadvantaged circumstances, to learn from MOOCs?
Methodology
The research design was planned using the RUFDATA framework (Saunders, 2000), which I shared in a blog post (Witthaus, 2017). This was a qualitative study, in keeping with the “emerging” and complex nature of a developmental evaluation. I was not commissioned by Kiron to do this study, but carried it out as a volunteer, framing my role as a critical friend rather than as a “neutral”, “external” evaluator. I had initial discussions with four members of Kiron staff to help me establish the research aims, and to determine the likely uses of the evaluation findings. The intention, from Kiron’s point of view, was to find out what aspects of their provision were working well, and whether there were aspects of the support they provided that would benefit from a different approach. It was agreed that I would ask the research participants the following questions:
- What helped you to succeed in learning online?
- How do you know whether you succeeded or not?
- What kinds of challenges did you face, and how did you overcome them?
- What else would have helped you succeed?
The data gathering process began at an on-site “study weekend” in Berlin, in August 2017, where I ran two focus groups with 13 learners. I asked them to write short statements in response to questions 1, 3 and 4 above, which we then discussed. Later, having obtained the learners’ consent, I emailed them all to request an online interview. I also invited my two Kiron “study buddies” to participate. In the email, I repeated the same three questions from the focus groups, and added question 2. Altogether, 11 learners responded. We carried out the interviews in September and October 2017. Unfortunately, one interview was cut short due to connectivity problems, leaving ten complete interviews. Two of the interviews took place mainly in German, and the rest in English with varying amounts of code-switching between the two languages. I therefore used “denaturalised” transcription (Oliver, Serovich & Mason, 2005), focusing on meaning rather than an exact replication of what was said. Personal data were recorded separately and aggregated anonymously to provide a demographic profile of the participants.
After asking the learners to review and edit the transcripts, I imported the transcripts into NVivo, where I carried out a first round of inductive categorical analysis, using open coding (Elo & Kyngäs, 2008). This phase generated codes mainly related to activities and resources that the research participants perceived to be either helpful or not for their learning, and their feelings about online learning. This was followed by a second round of coding involving a deductive categorical analysis, using the CoI survey instrument (Garrison, 2017, pp. 173-175) to code for teaching presence, social presence and cognitive presence, and the learning presence indicators from Shea et al. (2014, pp. 15-16). In the third round of coding, I reviewed every coded segment and developed a total of 43 statements or "depictions", following Saunders, Charlier & Bonamy (2005, p. 42), of what learners said had helped or hindered their learning, and what additional support they would like.
Learner Profiles
Five of the ten participants came from Syria, two from Afghanistan, two from Pakistan and one from Uganda. Their ages ranged from 22 to 42, with five in their 20s, four in their 30s and one in their 40s. Eight were studying business subjects, and two computer science. At the time of interviewing, three had been accepted into German universities; five were working towards applying in 2018; one was not planning to attend university, preferring to seek professional training opportunities; and one was undecided. Eight were male and two were female, which was an accurate, albeit unfortunate, reflection of the gender balance of Kiron’s learners at the time.
Results: Depictions of Learning
This section addresses the first two research questions:
RQ1: What are the learners’ depictions of how they learn online?
RQ2: How do these depictions map onto the indicators for the presences in the augmented CoI framework?
Research Questions 3 and 4 will be addressed in the discussion section.
Forty-three statements were generated about what had helped the learners learn, the nature of challenges experienced, and what further support they would like in relation to their online learning. The statements are presented in the tables below, organised around the augmented CoI framework headings and subheadings taken from Garrison (2017, pp. 173-175) and Shea et al. (2014). The phrasing of each depiction is an agglomeration of the words used by the research participants, and includes my own paraphrasing, in line with the concept of denaturalised transcription. In those cases where a concept articulated by the participants matched one of the 34 indicators in Garrison’s (2017) survey, Garrison’s wording is used. (There were only four such instances.) Under each table, some of the key quotations from research participants relevant to that presence are presented, numbered according to the depictions they refer to in the table.
Teaching Presence (TP)
As can be seen from Table 1, teaching presence was alluded to in just under half (19) of the depictions.

Table 1: Teaching Presence

| | It helped when… | I was challenged by… | I would like… |
| --- | --- | --- | --- |
| TP1 Design & organisation | (1) the course system was well organised and easy to make sense of. (2) the course materials were designed to be engaging. (3) the course materials were designed to be supportive. | (4) the educational offer from Kiron, which I could not make sense of on my own. (5) the platform, which was difficult to navigate. (6) courses that were not suitable for my level. (7) course content which was not related to real life. | |
| TP2 Facilitation | (8) I was guided towards understanding course topics. (9) I was kept “engaged and participating in productive dialogue” (Garrison, 2017, p. 173). (10) I was challenged to work things out for myself. (11) I was given support to stay focused on my learning. (12) the “development of a sense of community among participants” (Garrison, 2017, p. 173) was reinforced. (13) I was supported to reflect on my learning process in a structured way. | (14) the fast pace of delivery. | (15) someone to help me stay on task. (16) to be noticed, valued, and encouraged. |
| TP3 Direct instruction | (17) I was given feedback on my strengths, weaknesses, and understanding of the subject matter. | (18) course content that contained open-ended questions with no feedback. (19) course content that assumed prior knowledge that I did not have. | |

Several learners made comments about the ease or difficulty of making sense of both the Kiron and the MOOC providers’ platforms:
(1) With Kiron … you have an exact, direct study track, you have organised courses, you can organise your materials, what you want to study, you can take it step by step… you can really plan your target, you know exactly what courses you have to do, what the next step will be when you finish with this course… (Salim)
(4) When I got into Kiron, first of all I didn’t understand anything. The level of complexity was too high for a newcomer, to know how to use the software and how to get into it. It’s about six, seven, eight hours long the process. (Imran)
The novelty of online education was a significant barrier for some learners – although it should be noted that Imran’s comment was typical of learners who had joined Kiron before October 2016, when the platform navigation was improved. Engaging materials were described as those containing animations, cartoons, music and humour. Supportive materials typically included videos with subtitles and supplementary notes.
Many comments were made about the ways in which learners felt guided towards an understanding of subject matter. The role of “facilitator” was distributed between tutors, peers, mentors and friends, and facilitation took place in various ways:
(8) [Interviewer: So this guy in Thailand, is he mentoring you online?] Online. Just asking questions about what I’m doing, and wants to make sure that I’m doing well. He’s very old-fashioned… He never gives me the answers, he just gives me some signs. He wants me to find out the hard way, even if it takes a month or a year. (Omar)
(9) In EBWL we had live tutorials… It was a Hangout. There was a lecturer from Uni Aachen… The tutor displayed and explained a presentation… We also participated, said our ideas. There was interaction. (Ibrahim)
Some learners expressed a wish for greater direction, and a desire to be noticed and encouraged:
(16) If someone was available and said: “OK we have these courses. Per week, this is how many hours each lesson should take.” I don’t want restrictions, but I want to say a little bit of restriction is useful here… For example, I did not do anything for a year, and ((laughs)) nobody asked me if you were there, or what are you doing? I’m grateful to Kiron, but I still wasted time. (Nj)
Several examples were given of how learners had sought, received and benefited from feedback – from other learners, from quiz or test results, from employers, from friends and from mentors.
Learners often expressed frustration about the gaps in their knowledge:
(19) Sometimes we face new knowledge, which we didn’t get at the university or in life. … I wonder what that means, and so I have to look online or on YouTube or Google … just to find the definition of this knowledge. It takes some time to get the answer. Sometimes the answer is not within my knowledge of how I can understand this communication, and that’s a problem. (Mo)
Social Presence (SP)
Table 2 shows that seven of the depictions related to social presence, with learners focusing mainly on the value of face-to-face interaction and the difficulties of sustaining meaningful interaction with others online. The face-to-face study weekend was mentioned frequently, always in positive terms:
(20) It was nice and I don’t feel that I’m alone at least. You get to know people and you see this is this person, this is that person who commented about something online… (Jasmine)
Online communication was usually depicted as problematic:
(21) There is a tutorial on Hangout I attended… It was good, but I can say it would be better if it was… something constant, not just to say oh hello, how are you. We don’t know each other’s names because our emails are just letters and numbers. (Mo)
However, some positive experiences with online communication were reported:
(25) The other students also ask questions in different ways (in the online tutorials), and I think it makes our mind a little bit bright to understand the topic. (Qadir)
There were several requests for Kiron to facilitate local networking opportunities:
(22) How could I study with other students? I’m so active and motivated when I’m studying with other students. It’s a good solution if they connect us. (Fatimah)
Cognitive Presence (CP)
From Table 3 it can be seen that the learners emphasised the use they had made of different information sources to help them understand course content.

Table 3: Cognitive Presence

| | It helped when… | I was challenged by… | I would like… |
| --- | --- | --- | --- |
| CP1 Triggering event | (27) the presentation of course content grabbed my attention. | | |
| CP2 Exploration | (28) I used “a variety of information sources to explore problems” (Garrison, 2017, p. 174). | (29) the lack of focused and timely learning activity in the discussion forums and online study groups. | |
| CP3 Integration | (30) I talked to other people about what I was learning. (31) I spent time revising the basics in the subject I was studying. | | |
| CP4 Resolution | (32) I was able to “apply new knowledge to my work or other non-class activities” (Garrison, 2017, p. 175). | | |

The following quote was typical:
(28) I remember the (MOOC) professor giving an example of a company and then I stopped the video right away and looked up the company because I was just interested. Or he uses a term… and I would stop the video and look it up… And when he says a whole point and I could go back and replay it, that was very helpful. (Ayoubi)
Learning Presence (LP)
As indicated by Table 4, in terms of learning presence, examples of goal setting and planning abounded in the interviews, e.g.:
(34) I always take lectures and do the reading (on my smartphone) on my way to the office, and the test I always do at night (on my laptop) - it takes one or two hours… In short it is also not easy to complete these courses online without planning… I have also cut my time from watching TV and now spend it on my studies… The routine is necessary to achieve these goals. (Qadir)
Time management was also a significant preoccupation:
(37) The main problem is with time actually… there is no obligation. You had to do it by yourself… I always remind myself of my target, I have to do this and this, so motivation makes self-discipline. (Salim)
Finally, in relation to reflection, some learners were extremely resourceful in finding role models, mentors and opportunities to learn outside of their courses. Examples included learning from job interviews gone wrong or failed startup attempts, striking up conversations with strangers in a library, sending emails to experts identified through an online search, and volunteering at local community events in order to meet potential study mentors.
Discussion
I now return to RQ3: What can we learn about the application of the augmented CoI framework to the evaluation of the learning experience of refugees (and potentially also other learners in disadvantaged circumstances)?
In the data analysis, the key dimensions of the CoI framework (teaching presence, social presence and cognitive presence) proved useful, as did the subcategories (the labelled sections of the pie in Figure 1), which provided a structure for organising the 43 depictions. However, as noted, of Garrison’s (2017) 34 detailed survey indicators, only four mapped onto the depictions. It is worth noting that Shea et al.’s (2012) learning presence accounted for a full quarter of the statements generated. As an overarching framework, therefore, the main headings and subheadings of the augmented CoI framework enabled a coherent description of the learners’ experience. However, as predicted, some issues were not adequately addressed by the CoI framework. Most importantly, several learners referred to problems that social theorists describe as “structural” (Giddens, 1986), such as the way that the distractions of life as a refugee, being separated from one’s family, and feeling uncertain about the future made it difficult to focus on their studies. These issues manifest as personal problems, yet framing them in this way masks power relations within society. As one research participant put it:
As long as you’re getting the support, your family is taking care of you, you can put all your efforts in one direction and you can achieve it… For example, the last… ten days what I’ve been through has been horrific. I didn’t have the support when I was ill. There are lots of things that go through your mind, how am I going to manage that… Sometimes you feel a little bit disappointed because nobody is going to listen to this excuse… - they will see it on the paper that this guy has done this, but this guy couldn’t. (Imran)
The CoI framework’s lack of explanatory power for such issues reduces its usefulness, echoing Jaffer, Govender and Brown (2017) in this regard. Other aspects that are insufficiently addressed in the CoI framework include factors related to culture and online learning (Bozkurt, Yazici & Aydin, 2018), and how learners’ perceptions of agency (Archer, 2007) affect their decision-making around learning. These would all be viable avenues for future research.
Finally, I consider RQ4: What are the implications of the findings for organisations supporting refugees, and potentially other learners in disadvantaged circumstances, to learn from MOOCs?
In terms of teaching presence, the design of both the overall educational offer and individual courses played a role in learners’ motivation to participate. Materials that were designed to be supportive (e.g. videos with subtitles and accompanying notes) were central to the learners’ sense of progress. The ways in which learners were guided towards an understanding of key concepts, and received feedback on their strengths, weaknesses and understanding, were a major theme in the interviews. Responsibility for these functions was distributed widely among the MOOC lecturers (who appeared only in videos); Kiron’s volunteer tutors, mentors and buddies; competition judges; scholarship awarders; employment recruiters; and the friends, peers and mentors whom the learners chose for themselves. Automated feedback on quizzes was also appreciated. The prevalence of these varied sources of guidance and feedback in the learners’ narratives points to the centrality of the facilitation role.
In terms of social presence, opportunities for face-to-face interaction were highly valued. Participation in public MOOC “meet-ups” in learners’ local areas, using a facility such as Meetup.com, might address this perceived lack for some learners. Regarding online social presence, learners were reluctant to invest time and effort in discussion forums and learner-led online study groups, finding them generally lacking in focused and timely activity. Communication via these tools was also seen as impersonal. Clearly, social presence was not sufficiently fostered in the MOOCs in which these learners participated.
In terms of cognitive presence, the research participants seemed particularly adept at finding and utilising resources to supplement the courses and fill gaps in their knowledge. However, the full cycle of “trigger, exploration, integration and resolution” was not articulated by any of the participants. This may be because the interview questions did not specifically elicit it, but it may also point to a lack of focus by MOOC designers on the most critical element of the learning experience.
As for learning presence, the learners drew on a rich array of individual strategies to regulate their learning through goal setting and planning, and they discussed these at some length in the interviews. This suggests that a knowledge exchange between learners on learning strategies (perhaps even in the form of a MOOC) would be helpful for many learners.
Conclusion
The main limitation of this study relates to the sample size of ten research participants. A sample size of 12 has been shown to lead to data “saturation” in categorical analysis (Guest, Bunce & Johnson, 2006), although in the context of this case study, a much larger sample may be called for, given the diversity within the refugee community and the myriad factors that could affect learning from MOOCs. A second limitation was that I carried out the investigation alone, a necessary condition of the study being an output of my PhD; while I tried to be meticulous in my categorising, there was no inter-rater process to confirm the reliability of the findings. Despite these limitations, the study demonstrates that the Community of Inquiry framework, augmented with the construct of learning presence, is useful as a partial model for analysing the learning experience of learners in disadvantaged circumstances on MOOCs, although it does not account for the impact that structure and agency have on the learning process. In conclusion, the Kiron learners’ depictions of online learning presented here should dispel the myth that MOOCs are only suitable for privileged learners with substantial experience of higher education, while also offering insights for organisations that want to widen participation in higher education through MOOCs.
Acknowledgements
I would like to thank the Kiron learners, staff and volunteers who participated in this study, and David Hawkridge for his feedback on an early draft. I also acknowledge the support of academic staff and peers on the Doctoral Programme in Higher Education Research, Evaluation and Enhancement at Lancaster University from which this publication has arisen: http://www.lancaster.ac.uk/educational-research/phd/phd-in-higher-education-research,-evaluation-and-enhancement/
References