Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a function of a complex combination of learners’ identities, dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed, and how more traditional social science data analysis can support and be enhanced by learning analytics. We report progress in the design and implementation of learning analytics based on a research-validated multidimensional construct termed “learning power”. We describe, for the first time, a learning analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of analytics that it supports from real-time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme and identifying the challenges of integrating traditional social science research with learning analytics and modelling.
1. INTRODUCTION. Information infrastructure embodies and shapes worldviews. The work of Bowker and Star [1] elegantly demonstrates that the classification schemes embedded in information infrastructure are not only systematic ways to capture and preserve—but also to forget, by virtue of what remains invisible. Moreover, the user experience foregrounds certain information, thus scaffolding particular forms of human-computer and human-human interaction, which in turn promotes or obstructs sensemaking [2]. Learning analytics and recommendation engines are no exception: they are designed with a particular conception of ‘success’, thus defining the patterns deemed to be evidence of progress, and hence, the data that should be captured. A marker of the health of the learning analytics field will be the quality of debate around what the technology renders visible and leaves invisible, and the pedagogical implications of design decisions, whether the design rationale is explicit or implicit. In this paper we focus on the challenge of designing learning analytics that render visible learning dispositions and the transferable competencies associated with skillful learning in diverse contexts. These are dimensions of learning that both research and practice are demonstrating to be increasingly important, but which the learning analytics field has yet to engage with deeply. Mastery of discipline knowledge as defined by an explicit curriculum is obviously a critical yardstick in learning, and it is not surprising that currently, this is the dominant focus of most learning analytics research and product development, since this is the dominant paradigm in educational institutions. We know that this is greatly assisted when aspects of the domain and learner can be modelled: user models compare the inferred cognitive model against an ideal model (intelligent tutoring, e.g.
[3]); presentation layers may tune content dynamically if progress is deemed to be too slow (adaptive educational hypermedia, e.g. [4]); data mining techniques can be deployed, which usually assume the goal is to pass the course (e.g. [5]). In a different part of the learning analytics design space, we see the use of generic learning management systems that are agnostic as to the subject matter (and indeed have only a rudimentary model of the domain, if any). The trend to generic platforms is accompanied by their disaggregation, as open, social platforms, managed by many entities, are used for informal, self-directed learning, sometimes around the edges of formal courses. Learning analytics in these contexts must address a very different learning context, in which the domain, learning objectives, learner cohort and course materials may all be unknown in advance, and may not be controllable (Massive Open Online Courses – MOOCs – may be the extreme instance). Converging with these technology-driven trends is traditional social science research into the personal qualities that enable effective learning across contexts. There is substantial and growing evidence within educational research that learners’ orientation towards learning—their learning dispositions—significantly influences the nature of their engagement with new learning opportunities, in both formal and informal contexts. Learning dispositions form an important part of learning-to-learn competences, which are widely understood as a key requirement for life in the 21st Century. Despite this, employers complain increasingly that many graduates from our school and university systems, while proficient at passing exams, have not developed the capacity to take responsibility for their own learning and struggle when confronted by novel, real world challenges [6].
In this paper we argue that by combining extant research findings from the social science field of education, particularly concerning engagement in learning and pedagogy, with the affordances of learning analytics, we can develop learning platforms that more effectively catalyse the processes of learning for individuals and collectives. We introduce the concept of meta-competencies (§2) as one of several approaches to characterising the demands on learners made by today’s society, and we note the escalating problem of school disengagement (§3). We then summarise some of the core insights in the literature around engagement and learning dispositions (§4), before explaining the use of self-report as a means of gathering dispositional data (§5). In §6 we introduce Learning Power, a multi-dimensional construct for modelling learning dispositions, which has been under development and validation for over a decade, but in this paper we present for the first time the Learning Warehouse platform which underpins it (§7). This generates a visual analytic spider diagram for individuals, which renders the underlying model (§8), plus cohort summary statistics which can inform pedagogical intervention. In §9 we consider qualitative, quantitative and narrative ways to validate dispositional analytics of this sort, including evidence that the visual analytic has pedagogical affordances which build learners’ self-awareness. We also provide examples of how the analytics platform facilitates deeper analyses within and across datasets. §10 summarises four key forms of service that the platform is facilitating which help to close the research-practice gap. We conclude by summarising the contributions that this research makes (§11), and outlining some of the avenues now being pursued (§12). 2. META-COMPETENCIES. 
Where formal learning is highly specialised and discipline bound, very often graduates, including those with traditional degrees in ‘vocational’ subjects like engineering or law, find themselves with jobs in which they cannot make much use of whatever specialist knowledge they possess [7]. The acquisition of subject matter knowledge is no longer enough for survival and success in a society characterized by massive data flows, an environment in constant flux, and unprecedented levels of uncertainty (e.g. around how complex socio-technical systems will behave, and around what can or should be believed to be true, or ethically sound). What is needed in addition is the ability to identify and nurture a personal portfolio of competencies that enable personal and collective responses to complex challenges. We understand competence as a combination of knowledge, skills, understanding, values, attitudes and desires, which lead to effective, embodied human action in the world, in a particular domain. Skillful accomplishment in authentic settings requires not only mastery of knowledge, but the skills, values, attitudes, desires and motivation to apply it in a particular socio-historical context, requiring a sense of agency, action and value [8]. Writing from the perspective of education, Haste summarises competencies required for 21st century survival. She identifies one overarching ‘meta-competence’, the ability to manage the tension between innovation and continuity, and argues that this is constituted in five sub-competences: the ability to (i) adaptively assimilate changing technologies, (ii) deal with ambiguity and diversity, (iii) find and sustain community links, (iv) manage motivation and emotion, and (v) enact moral responsibility and citizenship. To be competent in this richer, more expansive sense, the ‘possession’ of knowledge is necessary but not sufficient.
Also required are personal qualities and dispositions, a secure-enough sense of identity and purpose, and a range of new skills that enable links to be made across domains and processes. Bauman has argued that deep engagement in learning is particularly important today for two reasons [9]. Firstly, as many school and university teachers will recognise, there is a contemporary search for identity in today’s fluid, globalised society, and secondly, “educational philosophy and theory face the unfamiliar and challenging task of theorising a formative process which is not guided from the start by the target form designed in advance” (p.139). That is, as we transition increasingly to a world where relevant ‘outcomes’ in a real world context can no longer be pre-determined with the confidence of earlier times, and where a learner’s intrinsic capacity to rise and adapt to a challenge is a highly valued trait, we need a theory and practice of engagement in learning that facilitates the formation of identity, combined with scaffolding the processes of knowledge creation and authentic performance. Thomas and Seely Brown [10] argue for the need to embrace a theory of “learning to become” (p.321) in contrast to theories that see learning as a process of becoming something. They argue that the 20th century worldview shift from learning as transmission to learning as interpretation is now being replaced by learning as participation, fuelled by structural changes in the way communication happens through new technologies and media. Participation is embodied and experienced, and critically, requires “indwelling”: The potential revolution for learning that the networked world provides is the ability to create scalable environments for learning that engage the tacit as well as the explicit dimensions of knowledge. The term we have been using for this, borrowed from Polanyi, is indwelling.
Understanding this notion requires us to think about the connection between experience, embodiment and learning. [10] (p.330) 3. LEARNER DISENGAGEMENT. The development of the above kinds of competencies presents a challenge for policies and pedagogies that validate learning solely in terms of standardised outcomes—designed (as are all analytics) to facilitate the generation of certain kinds of insight, for certain kinds of stakeholders. An over-emphasis on these indices is in tension with the need to take into account the complexity of learners’ sense of identity and their whole attitude to learning. If learners are, for whatever reason, fundamentally not disposed to learn, then extrinsic drivers around exam performance are unlikely to succeed. As Dewey (1933) observed: Knowledge of methods alone will not suffice: there must be the desire, the will, to employ them. This desire is an affair of personal disposition. [11] (p.30) Rising disengagement is a problem in many developed countries’ education systems. Research undertaken for the English Department for Education [12] reported in 2008 that 10% of students “hate” school, with disproportionate levels amongst less privileged learners (however, highly engaged students from poor backgrounds tend to outperform disengaged students from wealthy backgrounds). The Canadian Education Association regularly surveys student attitudes to school, reporting in 2009 that intellectual engagement falls during the middle school years and remains at a low level throughout secondary school [13]. A 2009 US study across 27 states reported that 49% of students felt bored every day, and 17% in every class [14]. These disturbing data point to a widening disconnect between what motivates and engages many young people, and their experience of schooling. This is serving as a driver for action research into new models focused on the holistic design of learning, catalysing academics [15-18] and national schools networks (e.g.
the UK’s WholeEducation.org). How can learning analytics research and development engage with this challenge? Certainly, there is a contribution to be made by providing more detailed, more timely information about performance—but while dismal analytics will help educators, their impact on already disengaged learners might be counterproductive. We propose that ‘disposition analytics’ could spark intrinsic motivation by giving learners insight into how they approach learning in general, and how they can become more skillfully equipped for many other aspects of their lives beyond school. We construe this challenge as one of defining, measuring, modelling and formalizing computationally the constructs associated with learning dispositions. 4. DEFINING DISPOSITIONS. What we are seeking to track, and model for analytics purposes, is a set of dispositions, values and attitudes that form a necessary, but not sufficient, part of a learning journey. Figure 1 summarises this conceptualisation of learning dispositions, values and attitudes. This is a complex and embedded journey because it takes seriously the social, historical, cultural and personal resources that shape, and are shaped by, people’s behaviour and dispositions. Learning dispositions are personal and autogenic. On the one hand they reflect ‘backwards’ (the ‘personal’ left side of Figure 1) to the identity, personhood and desire of the learner, and on the other hand, they can be skilfully mobilised to scaffold ‘forwards’ towards the acquisition of the knowledge, skills and understanding necessary for individuals to develop into competent learners (the ‘public’ right side of Figure 1). Competence in learning how to learn requires agency, intention and desire, as well as the dispositions or virtues necessary to acquire the skills, strategies and knowledge management necessary for making the most of learning opportunities over a lifespan, in the public domain.
Although the term ‘disposition’ is imprecise, both theoretically and in practice, it is widely agreed that it refers to a relatively enduring tendency to behave in a certain way [19]. It is a construct linked to motivation, affect and valuing, as well as to cognitive resources [20-24]. Dispositions may be culture specific as well as a relatively enduring feature of personality. A disposition arises from desire, or motivation, which provides the energy necessary for action [17, 25-27]. A disposition can be identified in the action a person takes in a particular situation – for example someone who is disposed to be ‘curious’ will demonstrate this in the manner in which they consistently generate questions and investigate problems. In practice, in education the term is often used interchangeably with ‘competence’ or ‘style’ or ‘capability’, and it is frequently subsumed within the concept of ‘personal development’ as distinct from academic development or attainment. There are many dispositions which are relevant for education – ranging from the specific to the very general, with varying conceptions as to how fixed or malleable they are. Our focus is on malleable dispositions that are important for developing intentional learners, and which, critically, learners can recognise and develop in themselves. 5. MEASURING DISPOSITIONS. Learning analytics cannot operate without data. For some approaches, this data is a by-product of learner activity, ‘data exhaust’ left in the online platform as learners engage with each other and learning resources. Other approaches depend on users self-disclosing ‘metadata’ about themselves intentionally, knowing that it will be sensed and possibly acted on by people or machines, known and unknown to them. Such ‘intentional metadata’ typically discloses higher order information about one’s state or intentions, which are harder to infer from low-level system event logs. 
Examples of higher order metadata would include emotional mood during one’s studies, the decision to ‘play’ with an idea or perspective, or setting out to build one’s reputation in a group. These might be disclosed in twitter-style updates, blog posts, comments in a meeting, written work and responses to quizzes/questionnaires. In looking to future research at the end, we signal new work on inferring dispositions from the ‘exhaust’ traces that learners leave in online environments, but the focus of this paper is on self-reported data gathered via a self-diagnostic ‘quiz’ (the research-validated ELLI survey introduced below). Figure 1: Dispositions as a personal attribute, embedded in a learning journey oscillating between personal and public. Self-report is a standard means of gathering data in the social sciences about an individual’s values, attitudes and dispositions, partly because of the challenges of observation at scale in non-digital environments, and partly because, however astute the observer may be, what a person thinks or feels is by definition idiosyncratic and cannot be confirmed solely from external behaviours and artifacts: take for example an engaged, motivated learner with low academic ability, who may produce a lower-graded piece of work than a bored, disengaged ‘high achiever’ who submits something they have no personal interest in. From the perspective of a complex and embedded understanding of learning dispositions, what learners say about themselves as learners is important and indicative of their sense of agency and of their learning identity (indeed at the personal end of the spectrum in Figure 1, authenticity is the most appropriate measure of validity). 6. MODELLING DISPOSITIONS. Learning Power is a multi-dimensional construct that has come to be used widely in educational contexts in the last ten years.
It is derived from literature analysis, and from interviews with educational researchers and practitioners about the factors which, in their experience, make good learners. The seven dimensions which have been identified harness what is hypothesised to be “the power to learn” — a form of consciousness, or critical subjectivity [28], which leads to learning, change and growth. An extensive literature review informed the development of a self-report questionnaire called ELLI (Effective Lifelong Learning Inventory), whose internal structure was factor analysed and validated through loading against seven dimensions [28]. As detailed later, these dimensions have since been validated with diverse learner groups, ranging in age from primary school to adults, demographically from violent young offenders and disaffected teenagers to high achieving pupils and professionals, and culturally from middle-class Western society to Indigenous communities in Australia. The term learning power has been used to describe the personal qualities associated with the seven dimensions, particularly by Claxton [29, 30], although in this paper its meaning is specifically related to the ELLI inventory. The inventory is a self-report web questionnaire comprising 72 items in the schools version and 75 in the adult version. It measures what learners say about themselves in a particular domain, at a particular point in time. A brief description of the seven dimensions is set out below, with three examples from the questionnaire shown for each dimension:

Changing & learning: Effective learners know that learning itself is learnable. They believe that, through effort, their minds can get bigger and stronger, just as their bodies can, and they have energy to learn (cf. [22]). The opposite pole of changing and learning is ‘being stuck and static’. I expect to go on learning for a long time. I like to be able to improve the way I do things. I’m continually improving as a learner.

Critical curiosity: Effective learners have energy and a desire to find things out. They like to get below the surface of things and try to find out what is going on. The opposite pole of critical curiosity is ‘passivity’. I don’t like to accept an answer till I have worked it out for myself. I like to question the things I am learning. Getting to the bottom of things is more important to me than getting a good mark.

Meaning Making: Effective learners are on the lookout for links between what they are learning and what they already know. They like to learn about what matters to them. The opposite pole of meaning making is ‘data accumulation’. I like to learn about things that really matter to me. I like it when I can make connections between new things I am learning and things I already know. I like learning new things when I can see how they make sense for me in my life.

Dependence and Fragility: Dependent and fragile learners more easily go to pieces when they get stuck or make mistakes. They are risk averse. Their ability to persevere is less, and they are likely to seek and prefer less challenging situations. The opposite pole of dependence and fragility is ‘resilience’. When I have trouble learning something, I tend to get upset. When I have to struggle to learn something, I think it’s probably because I’m not very bright. When I’m stuck I don’t usually know what to do about it.

Creativity: Effective learners are able to look at things in different ways and to imagine new possibilities. They are more receptive to hunches and inklings that bubble up into their minds, and make more use of imagination, visual imagery, pictures and diagrams in their learning. The opposite pole of creativity is ‘being rule bound’. I get my best ideas when I just let my mind float free. If I wait quietly, good ideas sometimes just come to me. I like to try out new learning in different ways.

Learning Relationships: Effective learners are good at managing the balance between being sociable and being private in their learning. They are not completely independent, nor are they dependent; rather they work interdependently. The opposite pole of learning relationships is ‘isolation and dependence’. I like working on problems with other people. I prefer to solve problems on my own. There is at least one person in my community who is an important guide for me in my learning.

Strategic Awareness: More effective learners know more about their own learning. They are interested in becoming more knowledgeable and more aware of themselves as learners. They like trying out different approaches to learning to see what happens. They are more reflective and better at self-evaluation. The opposite pole of strategic awareness is ‘being robotic’. If I get stuck with a learning task I can usually think of something to do to get round the problem. If I do get upset when I’m learning, I’m quite good at making myself feel better. I often change the way I do things as a result of what I have learned.

7. LEARNING WAREHOUSE PLATFORM. Without a learning analytics platform, it is impossible to gather ELLI data globally, with quality and access controls in place, and generate analytics fast enough to impact practice in a timely manner. ELLI is hosted (with several other research-validated tools) within a learning analytics infrastructure called the Learning Warehouse. A mature analytics infrastructure needs not only to gather and analyse data, but to orchestrate the tools offered to different stakeholders, and manage data access permissions in an ethical manner.
Learners, trainers/educators, researchers, and organisational administrators and leaders are provided with customised organisational portals onto the Learning Warehouse, which offer them different tools and levels of permission to datasets as follows: learners sign in to complete the right version of the ELLI questionnaire (e.g. Child or Adult) and receive their personal ELLI visual analytic (detailed in the next section); administrators can upload additional learner metadata or datasets; educators/organisational leaders access individual and cohort analytics, scaling to the organisation as a whole if required, in the form of visualised descriptive statistics. Authorised researchers can see all of the above, together with other datasets depending on the bases on which they were gathered. The portals also house learner identity metadata, held separately from the survey data in the Learning Warehouse, and destroyed after use. The Learning Warehouse uses the JSR-268 portlet standard, enabling ELLI profiles to be written and read via external platforms. The researcher interface is provided for querying within and across the anonymised datasets (at the time of writing, >40,000 cases). Where a data owner requires analysis involving identifiable data, the researchers are given permission to access this from the user portal, held on a different server, for the purposes of the specific project. The researcher interface enables access to aggregated, anonymised datasets over learner cohorts and across tools for researchers, with appropriate permissions within strict ethical guidelines. Researchers are then able to undertake system-wide research on a range of cases, across jurisdictions, instruments and domains, and can curate the data generated from it and make it available for secondary data analysis. Raw data may be downloaded for analysis in Excel and SPSS, with a unique identifier enabling integration of datasets, and in some cases matching with nationally procured datasets.
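The stakeholder permission model described above can be illustrated in outline. The sketch below is illustrative only: the Learning Warehouse’s actual implementation is not published at this level of detail, and the role names, record format and access rules are our assumptions, reflecting the separation of identity metadata from anonymised survey cases.

```python
# Illustrative sketch of role-based access to profile data, in the spirit of
# the stakeholder permissions described above. All names and rules here are
# assumptions, not the Learning Warehouse's actual implementation.
from dataclasses import dataclass


@dataclass(frozen=True)
class Record:
    learner_id: str   # identity metadata (held separately in the real system)
    case_id: str      # anonymised identifier used for dataset linkage
    scores: dict      # ELLI dimension scores (percentages)


def visible_fields(role, record, requester_id=None):
    """Return the fields one stakeholder role may see for a single record."""
    if role == "learner":
        # Learners see only their own profile.
        if requester_id != record.learner_id:
            return None
        return {"scores": record.scores}
    if role in ("educator", "researcher"):
        # Educators and researchers work with anonymised cases; access to
        # identifiable data requires separate, project-specific permission.
        return {"case_id": record.case_id, "scores": record.scores}
    raise ValueError("unknown role: %s" % role)
```

The key design point mirrored here is that the anonymised `case_id`, not the learner’s identity, is what flows into cohort analytics and secondary datasets.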
Up to this point, the use of data has fallen within the traditional social science domain in the way that it is used, as well as in the pedagogical domain through providing immediate, visual feedback for learners to appropriate and use in improving their learning power. The next step, which we are now beginning to explore, is a more integrated researcher experience, which incorporates tools more familiar to the learning analytics world, for example by providing web-based visual analytics tools for querying and interactively exploring data, drawing inspiration from user experiences such as Google Analytics and Gapminder. A second development emerges from recent work with collaborative, social applications, which are generating new kinds of data streams at a finer granularity than a complete ELLI questionnaire, and in real time rather than several months apart (e.g. the start and end of a conventional educational research project). We introduce this under future work. 8. ELLI VISUAL ANALYTICS. Visual analytics are helpful when it comes to comprehending and discussing a 7-dimensional construct such as learning power. On completion of an ELLI web survey, the Learning Warehouse generates a spider diagram (Figure 2), providing a visualization for the learner to reflect on (their own perception of) their learning power. The scores produced are a percentage of the total possible score for that dimension. The spider diagram graphically depicts the pattern and relative strength of individual scores. Note that unlike most spider diagrams, the axes are not numbered, but labeled ‘A little like me’, ‘Quite like me’, and ‘Very much like me’. As discussed shortly, a visual analytic such as this has a number of important properties, which can be empowering but also potentially demoralizing, and it is a principle behind the approach that learners are not left to ponder its meaning alone.
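The score computation behind the spider diagram — each dimension reported as a percentage of its maximum possible score — can be sketched as follows. The item groupings and the four-point response scale in this sketch are assumptions for illustration, not the published ELLI scoring key.

```python
# Sketch of per-dimension scoring: each dimension's score is the sum of its
# item responses, expressed as a percentage of the maximum possible total for
# that dimension. Item groupings and the 1-4 scale are illustrative assumptions.
def dimension_percentages(responses, items_by_dimension, max_per_item=4):
    """responses: {item_id: value}; items_by_dimension: {dimension: [item_ids]}."""
    scores = {}
    for dimension, items in items_by_dimension.items():
        total = sum(responses[item] for item in items)
        scores[dimension] = 100.0 * total / (max_per_item * len(items))
    return scores
```

For example, responses of 4 and 2 on a hypothetical two-item dimension yield (4 + 2) / 8, i.e. 75% of the total possible score, which is then plotted as one axis of the spider diagram.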
It is crucial that the learner validates and thus ‘owns’ the profile, a matter for the coaching conversation that follows with a trained mentor. Figure 2: An ELLI spider diagram generated from the Learning Warehouse. The shaded blue region shows the initial profile, while the outer red profile indicates ‘stretch’ on certain dimensions later in the learning project. Data can be aggregated across groups of learners in order to provide a mentor or teacher with a view of the collective profile on all or specific learning power dimensions (Figure 3). Figure 3: Visual analytics on aggregate ELLI data, for all learning power dimensions, and a specific dimension. The spider diagram has been further extended through the use of visual imagery, creating a culturally relevant character to represent each dimension. Examples (Figure 4) include the Simpsons cartoon characters when working with disaffected English teenagers [31], and iconic animals for Australian indigenous young people [32]. Figure 4: Adding visual symbols to Learning Power dimensions to localise them culturally. Top example also shows the addition of metaphorical ‘zones’ to the dimensions, to create mental spaces for learners to inhabit. 9. VALIDATION. Thorough validation of a learning analytic is a multi-faceted challenge. In this section we describe some of the facets relevant to a dispositional analytic such as ELLI. 9.1 Construct validity of ELLI. When analysing self-report data there are several ways of ascertaining reliability and validity. Sample size is important, with larger numbers giving greater confidence in standard statistical tests of reliability that explore how the instrument operates in repeated tests. ‘Construct validity’ refers to whether the tool measures what it was designed to measure, for which there exist well established criteria in the social sciences. The reliability and validity of ELLI has been reported in several peer reviewed educational publications [33-35]. 
9.2 Correlation with standardized attainment. Intuitively, one might hypothesise that learners who are curious, resilient, creative and strategic (i.e. in the terms of this paper, demonstrating learning power) should also record higher attainment in traditional tests, because they have, for instance, a much greater desire to learn, and ability to stretch themselves. This argument is made strongly by proponents of learning-to-learn, who argue for such approaches to be woven into teaching practice rather than being consigned to be taught as special topics (e.g. [30]). Consistent with this line of thought, one would predict ELLI to correlate positively with conventional attainment analytics, and indeed, several studies do report a positive correlation [33, 36, 37]. This is an intriguing finding, but the evidence remains inconclusive to date, and the relationship requires further interrogation: it might also be argued that more developed learning power could correlate negatively with test scores. For instance, an analysis of reliability and validity statistics for ELLI (N=10,496) in 2008 replicated a 2004 finding that the mean score on students’ learning power profiles gets significantly lower as students get older [35]. This takes us back to the earlier data reviewed on school disengagement: it points to a widening disparity between the dispositions that reflect learners taking skillful responsibility for their own learning in authentic contexts, and the demands of curricula and associated assessment regimes that focus on test results gathered under artificial conditions. 9.3 Pedagogical validity of ELLI profiles. In information visualization, visual analytics are judged in terms of the qualities of information design. We would argue that visual learning analytics should go one step further: when they are intended to empower learners, we need to understand their pedagogical affordances — the insight yielded for both educators and learners.
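The correlation analyses referred to above are standard Pearson product-moment correlations between dimension scores and attainment measures. A minimal sketch follows; the formula is textbook, and any data used to exercise it is hypothetical, not drawn from the cited studies.

```python
# Minimal Pearson product-moment correlation between a learning power
# dimension and an attainment measure. Standard formula; data passed in when
# exercising this is hypothetical, not values from the studies cited above.
from math import sqrt


def pearson_r(xs, ys):
    """Pearson's r for two equal-length sequences of paired observations."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)
```

In practice one would also report sample size and significance, as the N=10,496 analysis above does, since r alone says nothing about the reliability of the estimate.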
In school, workplace and Masters programmes, educators are trained how to use individual and cohort ELLI profiles to shape interventions and classroom practice, but space precludes a detailed report on this. We focus here on the methodological question of how one validates the pedagogical affordances of the ELLI profile for learners, where the objective is to catalyse changes in dispositions towards learning. Evidence of personal change is gathered using a mixed-methods approach combining quantitative pre- and post-test measures of learning power, plus qualitative and narrative evidence from student interviews. This has proven to be a powerful means of triangulating and validating evidence of impact, and communicating the findings [31, 32, 36, 38-41]. In one 2007 study in a UK school [39], quantitative analysis showed significant changes between pre- and post-measures across a whole year group; qualitative evidence identified key themes, and narrative evidence provided an insider’s perspective on the experience, such as the following statements from two 15- and 16-year-old students: It’s (about) understanding – because you can pass exams without understanding… It’s self growth and achievement… Our personal experience is important… Learning to tell your own story would make it easier to do all the other things you have to do – learn subjects, get grades etc… When I was a child… I was always much keener to do something if I knew I would get a reward at the end of it… the performance was important and not the process… and that’s the way the education system works… it’s very results driven… It’s a bit of a trust thing… they don’t trust you to do it in your own way… its a trust thing… It all ties together – its about self awareness more than anything else…
self awareness is not even touched upon in the education system… In another project with NEET learners (“Not in Employment, Education or Training”), a 16-year-old made significant changes in his pre-post profile through