Learning Analytics can provide powerful tools for teachers, supporting them in the iterative process of improving the effectiveness of their courses and, in turn, enhancing their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics Toolkit, which enables teachers to explore and correlate learning object usage, user properties, user behavior, and assessment results based on graphical indicators. The primary aim of the development of eLAT is to process large data sets in microseconds with regard to individual data analysis interests of teachers and data privacy issues, in order to help them reflect on their technology-enhanced teaching and learning scenarios and to identify opportunities for interventions and improvements.
"Introduction. Learning Management Systems (LMS) or Virtual Learning Environments (VLE) are widely used and have become part of the common toolkits of educators (Schroeder, 2009). One of the main goals of the integration of traditional teaching methods with technology enhancements is the improvement of teaching and learning quality in large university courses with many students. But does utilizing a VLE automatically improve teaching and learning? In our experience, many teachers just upload existing files, like lecture slides, handouts and exercises, when starting to use a VLE. Thereby availability of learning resources is improved. For improving teaching and learning it could be helpful to create more motivating, challenging, and engaging learning materials and e.g., collaborative scenarios to improve learning among large groups of students. Teachers could e.g., use audio and video recordings of their lectures or provide interactive, demonstrative multimedia examples and quizzes. If they put effort in the design of such online learning activities, they need tools that help them observe the consequences of their actions and evaluate their teaching interventions. They need to have appropriate access to data to assess changing behaviors and performances of their students to estimate the level of improvement that has been achieved in the learning environment. With the establishment of TEL, a new research field, called Learning Analytics, is emerging (Elias, 2011). This research field borrows and synthesizes techniques from different related fields, such as Educational Data Mining (EDM), Academic Analytics, Social Network Analysis or Business Intelligence (BI), to harness them for converting educational data into useful information and thereon to motivate actions, like self-reflecting ones previous teaching or learning activities, to foster improved teaching and learning. The main goal of BI is to turn enterprise data into useful information for management decision support. However, Learning Analytics, Academic Analytics, as well as EDM more specifically focus on tools and methods for exploring data coming from educational contexts. While Academic Analytics take a university-wide perspective, including also e.g., organizational and financial issues (Campbell & Oblinger, 2007), Learning Analytics as well as EDM focus specifically on data about teaching and learning. Siemens (2010) defines Learning Analytics as “the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on learning.†It can support teachers and students to take action based on the evaluation of educational data. However, the technology to deliver this potential is still very young and research on understanding the pedagogical usefulness of Learning Analytics is still in its infancy (Johnson et al., 2011b; Johnson et al., 2012). It is a current goal at RWTH Aachen University to enhance its VLE—the learning and teaching portal L²P (Gebhardt et al., 2007)—with user-friendly tools for Learning Analytics, in order to equip their teachers and tutors with means to evaluate the effectiveness of TEL within their instructional design and courses offered. These teachers still face difficulties, deterring them from integrating cyclical reflective research activities, comparable to Action Research, into everyday practice. 
"Action Research is characterized by a continuing effort to closely interlink, relate and confront action and reflection, to reflect upon one's conscious and unconscious doings in order to develop one's actions, and to act reflectively in order to develop one's knowledge" (Altrichter et al., 2005, p. 6). A pre-eminent barrier is the additional workload originating from the tasks of collecting, integrating, and analyzing raw data from the log files of their VLE (Altenbernd-Giani et al., 2009). To tackle these issues, we have developed the "exploratory Learning Analytics Toolkit" (eLAT). The main aim of eLAT is to support reflection on and improvement of online teaching methods based on personal interests and observations. To help teachers reflect on their teaching according to their own interests, the desired Learning Analytics tool is required to provide a clear, simple, easily interpretable, and usable interface while, at the same time, being powerful and flexible enough for data and information exploration. Therefore, eLAT was designed to enable teachers to explore and correlate content usage, user properties, user behavior, and assessment results based on individually selected graphical indicators. Dyckhoff et al. (2011) gave a short overview of the toolkit. In the remainder of this paper, we present the theoretical background, design, implementation, and evaluation results of eLAT in more detail. In section 2 (Theoretical background), we provide information on the theoretical background and briefly describe the results of a requirements analysis and its implications for Learning Analytics tools. In section 3 (eLAT: exploratory Learning Analytics Toolkit), we discuss the design, implementation, and evaluation of eLAT. In section 4 (Related Work), we compare our approach to state-of-the-art solutions. Even though there are some approaches to support teachers in their ongoing evaluation and improvement activities (e.g., Johnson et al., 2011a; García-Saiz & Zorrilla Pantaleón, 2011; Mazza & Dimitrova, 2007; Pedraza-Perez et al., 2011), many challenges remain (Chatti et al., 2012). Examples include integration with other VLEs and integration of diverse data sources, minimizing the time delay between the capture and use of data, consideration of data privacy issues, protection of students' identities and prevention of data misuse, enabling data exploration and visualization manipulation based on individual data analysis interests, providing the right information to the right people right away, and investigating which captured variables may be pedagogically meaningful (Elias, 2011; Johnson et al., 2011b). By developing eLAT, we have tried to tackle these challenges. The final section 5 (Conclusion and Outlook) gives a summary of the main findings of this study with respect to the challenges mentioned above and outlines perspectives for future work.

Theoretical background

In TEL, masses of data can be collected from different kinds of student actions, such as solving assignments, taking exams, online social interaction, participating in discussion forums, and extracurricular activities. This data can be used for Learning Analytics to extract valuable information, which might help teachers reflect on their instructional design and the management of their courses. Usable EDM and Learning Analytics tools for teachers that support cyclical research activities are still missing in most current VLEs or are far from satisfactory (Hijon & Carlos, 2006). Romero et al.
state: "[…] data mining tools are normally designed more for power and flexibility than for simplicity. Most of the current data mining tools are too complex for educators to use and their features go well beyond the scope of what an educator might require" (Romero et al., 2007, p. 369). If tracking data is provided in a VLE, it is often incomprehensible, poorly organized, and difficult to follow because of its tabular format. As a result, only skilled and technically savvy users can utilize it (Mazza & Dimitrova, 2007). But even for them it might be too time-consuming. Moreover, unnecessary personal information about students can be observed by teachers or even fellow students, i.e., data privacy issues are ignored in the design of most VLEs (Loser & Herrmann, 2009). Legal issues have to be taken into account to prevent malpractice. In Germany, for example, teachers are not allowed to have access to everything a student does in their online courses. They are only supposed to have access to data that is relevant for teaching, in a form that is transparent to students (Directive 95/46/EC, 1995; Federal Data Protection Act, 1990). However, many research questions of teachers are concerned with the general learning processes of a whole group of students rather than with gaining more knowledge about a single student. Therefore, to ensure data privacy, student data could be pseudonymized in a preprocessing step, e.g., by using a hash instead of a student ID, and by presenting summarized results in the form of visualizations that show group processes and do not allow focusing on a single student. Further deficiencies of reporting tools are related to usability and clarity as well as to the completeness of the delivered results, such as the lack of possibilities to integrate results of online questionnaires with data from logs.

Many teachers are motivated to evaluate their courses and already have research questions related to their teaching in mind. For example, a teacher who offers weekly online exercises intends to help her students prepare for an exam. But she is not sure whether the currently available exercises are helpful enough for this purpose. Therefore, the teacher would like to know whether those students who practice with her online exercises on a regular basis perform better in the final exam than students who do not use them. A Learning Analytics toolkit could help her investigate this hypothesis by automatically collecting, analyzing, and visualizing the right data in an appropriate way. Yet, most monitoring and reporting tools found in current VLEs are designed to collect, analyze, and visualize data in a static tabular form that was predefined by system developers. Teachers face the difficulty that appropriate and usable Learning Analytics tools that help them answer their individual questions continuously and efficiently are missing in prevalent VLEs, since most of the work in the area of Learning Analytics is conceptual (Johnson et al., 2011b). Teachers should have access to Learning Analytics tools (e.g., provided via dashboards) that can be integrated into a VLE or other learning environments. These tools should allow for interactive configuration in such a way that users can easily analyze and interpret the available data based on individual interests. Results of Learning Analytics and EDM should be presented in a clear and understandable format, e.g., as information visualizations that are understandable without data mining expertise. Card et al.
(1999) define the term visualization as "the use of computer-supported, interactive, visual representations of data to amplify cognition." It "promises to help us speed our understanding and action in a world of increasing information volumes" (Card, 2003, p. 542). It has been widely argued in the EDM literature that teachers can grasp information more easily and quickly when it is presented through comprehensible information visualizations. Mazza and Dimitrova (2007, p. 138), for instance, write: "…the effectiveness of [Course Management Systems] can be improved by integrating [Information Visualization] techniques to generate appropriate graphical representations …." Also, the results of a recent study by Ali et al. (2012) showed that "visualization can be an effective mean to deal with larger amounts of data in order to sustain the cognitive load of educators at an acceptable level" (p. 486) and that "multiple ways of visualizing data increase the perceived value of different feedback types" (p. 488). Still, it should be noted that in some analytical cases visualizations can be less effective than a textual or a tabular interface, e.g., if details about many individual items are important. In our approach, we only indicate certain facts about the usage and properties of the learning environment and try to visualize them appropriately. Therefore, we revert to the concept of indicators, which can be described as specific calculators with corresponding visualizations, tied to a specific question. For example, if the teacher's question is "Do those students who practice with online exercises on a regular basis perform better in the final exam than students who do not use them?", the corresponding indicator could show a chart that facilitates a quick visual comparison of the data. Indicator concepts have been used before. Glahn (2009), for example, introduced the concept of smart indicators, which he defined as "a context aware indicator system, which dynamically aligns data sources, data aggregation, and data presentation to the current context of a learner" (Glahn, 2009, p. 19). In our case, however, the target group differs: the eLAT indicators collect and visualize data about students in order to present it to teachers.

A typical Learning Analytics process is depicted in figure 1. The process starts with the data-gathering step. In this step, data is collected from different learner activities as they interact with learning elements within a VLE, LMS, or a personal learning environment (PLE). Examples of such activities include participating in collaborative exercises, writing a forum post, or reading a document. In the data collection step, it is crucial to address data privacy issues. Often the output of the data extraction and preprocessing step is transferred into a separate database. The second step of the Learning Analytics process is the mining of the preprocessed data, based on different mining techniques, such as clustering, classification, association rule mining, and social network analysis. Thereafter, the results of the mining process can be presented as a widget, which might be integrated into a VLE, a dashboard, or a PLE.
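As a rough illustration of the first steps of this process, the following Python sketch extracts activity events, replaces student IDs with salted hashes during preprocessing (as suggested above for data privacy), and aggregates the number of distinct active students per week, i.e., the kind of summary a chart widget could then visualize. The class and function names and the event format are illustrative assumptions, not part of the actual eLAT implementation.

```python
# Illustrative sketch only: pseudonymize raw VLE events and aggregate them
# per calendar week, so that a chart widget can plot group-level activity.
import hashlib
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RawEvent:
    student_id: str      # real ID as logged by the VLE (hypothetical field names)
    timestamp: datetime
    resource: str        # e.g., "lecture_03.pdf" or "forum/post/17"

def pseudonymize(student_id: str, salt: str) -> str:
    """Replace the real student ID by a salted hash, as proposed for preprocessing."""
    return hashlib.sha256((salt + student_id).encode("utf-8")).hexdigest()[:16]

def weekly_active_students(events: list[RawEvent], salt: str) -> dict[str, int]:
    """Count distinct (pseudonymized) active students per ISO calendar week."""
    active: dict[str, set[str]] = {}
    for e in events:
        year, week, _ = e.timestamp.isocalendar()
        key = f"{year}-W{week:02d}"
        active.setdefault(key, set()).add(pseudonymize(e.student_id, salt))
    return {week: len(students) for week, students in sorted(active.items())}
```

A mining step, e.g., clustering such weekly profiles, would then operate on these pseudonymized aggregates rather than on raw log records, and only the aggregated results would reach the widget.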
Based on appropriate graphical visualizations of the analyzed data, teachers should be able to interpret the visualized information more quickly, reflect on the impact of their teaching methods on the learning behavior and performance of their students, and draw first conclusions about the effectiveness of their teaching, i.e., consider whether their goals have been reached. Furthermore, unexpected findings should motivate them to iteratively improve their teaching interventions. However, having a graphical visualization does not guarantee that teachers will be able to interpret the represented information correctly. Indicators must be designed and evaluated carefully. Also, the system should provide instructions for interpretation.

Figure 1. The Learning Analytics Process.

Requirements for developing such dedicated systems have been collected in a former study by analyzing the interests and needs of the target group in more detail (Dyckhoff, 2010). The results of this study showed that teachers already have various questions about their instructional design and the utilization of learning materials, the students' learning behaviors, and correlations between objects of teaching and learning as well as outcomes. Their intentions can be, e.g., to find out how well the overall instructional design is appreciated, to learn more about the needs of all or a specific group of students, or to better understand learning processes in general. The conclusion from the study mentioned above was that Learning Analytics tools should support teachers by collecting, integrating, and analyzing data from different sources as well as by providing step-by-step guidance, including semi-automated processes, instead of just presenting large tables of data. "It is undisputable that statistics in isolation represent only one aspect of any real-world situation. To make more meaningful interpretations, [educators] often need to look at the two or more statistics together" (Ali et al., 2012, p. 484). Hence, teachers should be able to choose from a flexible and extendable set of indicators. The system should guide the user throughout the research process, help him or her form research questions, recommend and provide appropriate methods for data collection, integrate data from different sources, and support its collaboratively organized analysis. Such a Learning Analytics tool could, e.g., provide extendable lists of supported research questions (indicators) and suitable qualitative as well as quantitative methods for data collection, visualization, and analysis. Furthermore, it should be possible to use and integrate the tool with any kind of VLE and learning software.

eLAT: exploratory Learning Analytics Toolkit

In the following sections, we introduce eLAT by giving an overview of the results of the requirements analysis, the development stages, design decisions, and evaluation phases, concluding with a discussion of our basic findings.

Requirements

Requirements for eLAT have been collected through literature analysis (Dyckhoff, 2010), as well as by informally talking to teachers, e-learning experts, and system administrators at RWTH Aachen. The requirements analysis resulted in the following main design goals:
- Usability: prepare an understandable user interface (UI), appropriate methods for data visualization, and guide the user through the analytics process.
- Usefulness: provide relevant, meaningful indicators that help teachers gain insight into the learning behavior of their students and support them in reflecting on their teaching.
- Interoperability: ensure compatibility with any kind of VLE by allowing for the integration of different data sources.
- Extensibility: allow for incremental extension of the analytics functionality after the system has been deployed, without rewriting code.
- Reusability: aim for a building-block approach, so that more complex functions can be implemented by reusing simpler ones.
- Real-time operation: make sure that the toolkit can return answers within microseconds to allow for an exploratory user experience.
- Data privacy: preserve confidential user information and protect the identities of the users at all times.

Usability and usefulness: Every course is different, depending on the teachers and students who are involved in it. There are different teaching strategies, different learning goals, etc. Among the teachers there are some who have never used a Learning Analytics tool before, as well as advanced users. A Learning Analytics tool should be easy to use and understandable for all of them. It must be usable both for the beginner, who looks at it for the first time, and for the expert, who already has a specific question and wants to perform deeper analysis. For beginners, a Learning Analytics tool should enable a direct entry and motivate them to engage more with the underlying data, e.g., through a dashboard solution. Experts should find ways to explore the data and perform further analyses that keep them engaged. Varying learning scenarios will also demand differing sets of indicators. An important future research task is to find out which indicators are useful for whom and in what situation.

Interoperability, extensibility, and reusability: Most existing Learning Analytics tools cannot be easily adapted to a different VLE. In addition, new e-learning systems are being developed that may contain useful data for Learning Analytics. Also, learning may take place on informal learning platforms. Therefore, an interoperable Learning Analytics tool that integrates with other systems and can collect and analyze data from different platforms is required.

Real-time operation: New issues in a course may arise at any time during a semester and should then usually be answered directly, so that timely improvements can be made. Also, new questions that are worth examining more closely may arise while ongoing questions are being answered. Therefore, a Learning Analytics tool should provide current data and comprehensive data analysis capabilities and be available at all times, not only at the end of the semester. Also, interactive analysis and visualization features, like filtering options for exploring the data in more detail, should deliver results and updated visualizations within microseconds.

Data privacy: Personal data should be protected at all times to prevent abuse. Data privacy acts ensure such protection (e.g., Directive 95/46/EC, 1995; Federal Data Protection Act, 1990). However, an exception is made for teaching and research projects, under the condition that the data is handled transparently and purposefully (Federal Data Protection Act, 1990). In addition, students or a data protection officer could be asked to consent to the collection and analysis of student data. Many questions regarding teaching, however, do not aim to examine records of individual students.
Rather, data about the totality of students or about subgroups with specific characteristics is interesting for drawing conclusions on learning processes. Data could be stored and processed in pseudonymized form to protect the users. As further protection, the tool could ensure that certain kinds of analyses cannot be executed in situations where they would lead to the identification of individual students.

Development stages and evaluation methods

eLAT was developed iteratively and incrementally within two main stages that partially overlapped, in order to meet the requirements described above: (stage 1) the implementation and testing of a backend framework and (stage 2) the design and evaluation of a UI (frontend). In the first stage, eLAT was designed as a prototype to evaluate different software architectural approaches for Learning Analytics using different VLE platforms. During the winter term 2010/2011, we selected four courses that were using the learning and teaching portal L²P of RWTH Aachen University. The courses differed in course size (1370, 338, 220, and 38 registered students), learning technologies, and teaching styles to ensure realistic usage scenarios. We logged the students' activity, interaction, and (in one case) assessment data over a duration of three months. By using the data of real courses, it was possible to learn more about the meaningfulness of already implemented indicators and to let the teachers of these courses participate in the development process of eLAT. In this way, we could get immediate feedback and comments on prototype stages that already processed analytics based on real data.

The design and evaluation of a UI (stage 2) started in parallel to the first development stage described above. It was conducted iteratively as well, where each of the overall three iterations had a specific objective. The first iteration dealt with the collection and definition of the content. Since eLAT was designed to enable teachers to explore educational data about their students and courses based on graphical indicators, this involved the collection of indicators as well as assigning priorities to them. Thus, semi-structured interviews were conducted to evaluate a set of graphical indicators and to elicit further user requirements. Semi-structured interviews are used to collect facts, opinions, and attitudes of the interview partners (Naderer, 2007). The interviews are guided by prepared questions, but it is also possible to ask questions spontaneously to investigate interesting details (Lindlof and Taylor, 2002). The second iteration focused on the layout and data presentation of the UI. The evaluations of the first and second iterations were performed with the help of paper prototypes, while in the third iteration a functional UI, which was implemented based on the previous evaluation results, was used to investigate interactivity and usability aspects. Layout and data presentation were designed and evaluated by means of heuristic evaluation, cognitive walkthrough, and pluralistic walkthrough. A heuristic evaluation uses approved usability principles or guidelines to investigate the usability of a UI; thus, problems can be discovered with little effort in an early development step (Nielsen, 1992). A cognitive walkthrough is more formal than a heuristic evaluation. It requires a specification of the UI and of tasks with which to evaluate the usability. With the help of these tasks, the UI can be examined step by step to discover usability problems (Polson et al., 1992; Dix et al., 2004).
Both methods were chosen to evaluate the prototypes of the UI at an early stage of development. The pluralistic walkthrough is similar to the cognitive walkthrough. It is a meeting of experts from different domains, such as users, designers, and usability specialists, who discuss elements of the interface prototype from the point of view of the users (Bias, 1994). We used a variant of the pluralistic walkthrough in which a domain expert, a usability expert, and the designer discussed the interface from the users' perspective. The main results of these studies are presented in the section "User interface". The third iteration, which was mainly concerned with interaction, included a qualitative think-aloud study. Here, users were asked to perform tasks with a software prototype while talking about what they were doing and thinking. During the tasks, the evaluator observed them. This method was chosen to identify areas of interaction where users can make mistakes (Dix et al., 2004). In the following sections, we present the resulting UI, use cases, design and implementation details of eLAT, as well as overall evaluation results.

User interface

The structure and layout of the eLAT user interface (UI) are the result of an iterative approach and were derived from our user studies, which have been discussed in detail in Bültmann (2011). The UI is designed as a launch pad, which is similar to a dashboard but provides more comprehensive analysis options in addition to an initial overview (Few, 2006). A monitoring view helps to observe several indicators at once (figure 2). Furthermore, analysis views provide deeper insight into the data of chosen indicators by making it possible to drill down into details by changing the parameters of an indicator. Additionally, a mouse-over effect shows details about the currently regarded information (figure 3). In the monitoring view, the content of the launch pad is grouped into four widgets. The widgets are containers for indicators related to the categories "document usage," "assessment/performance," "user activity," and "communication." Each indicator has its own tab in the widget. This hierarchical layout is supposed to help users get a better overview of the current learning situation. By using widgets and tabs, it is possible to put all indicators on one single screen. This concept also helps in terms of personalization, because widgets can be arranged flexibly by the users.

Figure 2. Monitoring view of the eLAT user interface.

Figure 3. Analysis view of the indicator "Activity behavior".

The analysis view of each indicator is consistently accessible in the corresponding tab by clicking "Configure indicator details" (figure 2). This detailed view of the indicator is shown as an overlay on top of the monitoring view (figure 3). Layout and functionality have been designed in a consistent way to achieve better usability (Few, 2006). On the right side of the analysis view is a filtering menu. The filtering of the presented data is context-dependent according to the currently selected indicator. For each tab in the filtering menu of any indicator, the user can determine which information the indicator should present. Hence, there are many options for data exploration, such as comparing the activity of male and female users or the activity of students of different study programs. But not all filters can be used in each context. Because of data privacy regulations, we cannot allow the use of user properties like gender or study course when there are fewer than a certain number of students, e.g., five users with that property in a course.
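One way such a restriction could be enforced, sketched here under the assumption of a minimum group size of five (the threshold constant, function name, and data layout are illustrative and not taken from the eLAT code), is to check the size of every group that a property-based filter would create and to disable the filter if any group falls below the threshold:

```python
# Sketch: suppress a property-based filter if any resulting group would be
# smaller than a minimum group size (here assumed to be five students).
from collections import Counter

MIN_GROUP_SIZE = 5  # assumed threshold, following the "e.g., five users" rule above

def filter_is_allowed(user_property_values: list[str],
                      min_group_size: int = MIN_GROUP_SIZE) -> bool:
    """Return True only if every group induced by the property has at least
    `min_group_size` members, so that no individual student can be singled out."""
    group_sizes = Counter(user_property_values)
    return bool(group_sizes) and min(group_sizes.values()) >= min_group_size

# Example: a course with only 3 students in one study program would block that filter.
programs = ["CS"] * 40 + ["EE"] * 12 + ["Physics"] * 3
print(filter_is_allowed(programs))                    # False: "Physics" group too small
print(filter_is_allowed(["CS"] * 40 + ["EE"] * 12))   # True
```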
The following paragraphs give an overview of six implemented indicators, which were rated as interesting in the course of the evaluations.

Figure 3 shows the analysis view of the indicator "Activity behavior". Student data is divided into the three groups "very active students" (blue bars), "active students" (red bars), and "inactive students" (yellow bars), and their weekly distribution is shown over a time span. An "active student" is determined by calculating the average number of days per week on which a student was active, i.e., logged into the system. In the current configuration of the indicator "Activity behavior", shown in figure 3, a student is defined as "active" if he or she logs in at least once a week. A student is defined as "very active" if he or she logs in on more than five days a week. The user can change the time span and the definition of an "active student."

Figure 4. Indicator "Access and accessing students".

The data in figures 3–7 is based on a programming course, which finished with a final exam on February 8th, 2010. As expected, the indicator above shows an increase of "inactive students" after the exam date. The indicator "Activity behavior" (figure 3) indicates whether continuous learning is taking place. A participant of our semi-structured interviews considered continuous learning a main factor for good exam results. As a sign of continuous learning, the teacher might, e.g., expect his students to log in at least twice a week to download new materials and stay up to date with course information. The indicator "Activity behavior" can show tendencies of increasing or decreasing numbers of such active (groups of) students. High numbers of inactive students during the semester could prompt the teacher to motivate his students to learn more regularly, e.g., by creating weekly exercises, or to initiate further investigations into the reasons for the low activity.

The indicator "Access and accessing students" (figure 4) supports teachers in monitoring the overall online activity of their course. It shows the number of accesses/clicks (blue line) over the number of unique students (red line) who accessed the virtual learning environment during a time span defined by the user. The blue line represents the sum of every single click on any resource in the learning environment per day or week. It is important for a teacher to observe whether, e.g., a small group of students clicks many times or many students click once on a resource. The data in figure 4 shows that almost every day about a third of the 278 registered students accessed several resources. The peak at the end of the timeline demonstrates a strong increase in accesses before the final exam, but only a small increase in accessing students. The lines converge after the date of the exam. Probably, the students only come back to the virtual course room to check the exam results (one click per student). The indicator "Access and accessing students" (figure 4) can reveal outliers from the usual access behavior/frequency. Teachers can relate high or low usage, e.g., to teaching events or holidays. They can quickly observe whether changes of learning materials or didactics lead to changes in overall usage behavior. This might motivate them to experiment with didactics to improve the overall access to the learning environment.
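To make the grouping behind the "Activity behavior" indicator described above more tangible, the following sketch classifies students per week using the thresholds stated in the text (at least one login day per week for "active", more than five login days per week for "very active"). The event format and function names are assumptions for illustration, not the eLAT implementation:

```python
# Sketch: classify each (pseudonymized) student per week as "very active",
# "active", or "inactive", based on the number of distinct days with a login.
from datetime import datetime

def classify_week(login_days: int) -> str:
    """Thresholds taken from the indicator description above."""
    if login_days > 5:
        return "very active"
    if login_days >= 1:
        return "active"
    return "inactive"

def activity_groups(logins: list[tuple[str, datetime]]) -> dict[str, dict[str, str]]:
    """logins: (pseudonym, timestamp) pairs; returns {week: {pseudonym: group}}."""
    days_per_week: dict[str, dict[str, set[str]]] = {}
    for pseudonym, ts in logins:
        year, week, _ = ts.isocalendar()
        wk = f"{year}-W{week:02d}"
        days_per_week.setdefault(wk, {}).setdefault(pseudonym, set()).add(ts.date().isoformat())
    return {
        wk: {p: classify_week(len(days)) for p, days in students.items()}
        for wk, students in days_per_week.items()
    }

# The per-week counts of each group would then be drawn as the colored bars of
# the "Activity behavior" chart; registered students with no logins in a week
# would be counted as "inactive" once the course roster is joined in.
```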
Figure 5. Indicator "Activity areas".

With the help of the "Activity areas" indicator, shown in figure 5, teachers are supposed to identify whether and when students access which parts/areas of a virtual course room per week within a defined time span. Hence, access rates of different functions, like wiki pages or the discussion forum, can be compared and related to teaching events as well. The x-axis of the indicator shows the days or weeks. If metadata on the dates of course events, like the occurrence of specific lectures or the exam, is provided, these events can also be marked on the x-axis. The y-axis records the number of students who were active, i.e., clicked on resources, during that day or week in a specific part of the virtual course room. The red line in figure 5, e.g., shows the number of students accessing a document library with learning materials, such as lecture scripts and exercises. The red line and the yellow line, which represents the number of students who accessed the discussion forum, peak 1–3 days before the exam. Students seem to become more active in reading and discussing during that time, so that the case could be made that they are learning more intensively.

Figure 6. Indicator "Top 10 resources".

The "Top 10 resources" indicator (figure 6) gives an overview of the most accessed materials. It can help to identify active documents/items that have been accessed more often than others. Such a popularity indicator
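A count of the most accessed materials, as shown by the "Top 10 resources" indicator, could be sketched as follows; the data layout and function name are assumptions for illustration only:

```python
# Sketch: compute a "Top 10 resources" list from (pseudonym, resource) access
# events, i.e., the ten most frequently accessed materials in a course room.
from collections import Counter

def top_resources(accesses: list[tuple[str, str]], n: int = 10) -> list[tuple[str, int]]:
    """accesses: (student_pseudonym, resource_id) pairs; returns the n most
    accessed resources together with their total access counts."""
    counts = Counter(resource for _, resource in accesses)
    return counts.most_common(n)

# Example output: [("lecture_10.pdf", 412), ("exercise_07.pdf", 390), ...]
```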