Learning Analytics Tool Evaluation
My goal was to develop a framework for assessing whether digital learning tools are effective in supporting engagement, course pacing, and critical thinking. This artifact involves the investigation and assessment of an application (D2L Insights) that claims to improve student engagement. It represents my ability to seek out a tool that can achieve what I am looking for, as well as my ability to critique it using a research-based, proven framework (the Evaluation Framework for Learning Analytics, or EFLA).
This artifact demonstrates my understanding of engagement from multiple perspectives, such as how an administrator would use analytics tools in course design and management to increase engagement (as opposed to student-facing engagement tools). It also shows that I am learning to apply established frameworks, such as the EFLA, to my evaluation of learning applications and tools so that my assessments are research-based.
Ultimately, I included this artifact in my portfolio because it shows my ability to discern whether a digital learning application can actually do what it claims. With this particular application, I concluded that it was not as useful for increasing engagement as advertised. My aim is to highlight my ability to use sound methodology to separate the applications and tools that will be effective in improving outcomes, such as engagement, from those that will not.
This artifact connects to the metaphor of opposing forces because it brings forward the tension between Data and Application. The application's claim is that its data can lead to the desired outcome (increased engagement), but this critique questions whether it does so in practice. The application claims that its data collection tools (and the data itself) promote student learning; however, some tools and data exist merely for the sake of the technology, for example a tool that collects data on student platform usage but provides no insights or actions based on that data. The opposite extreme, a learning environment with no data at all, is not ideal either, since it leaves educators in the dark. Between data for its own sake and no data at all, what we actually need is a balance: useful, actionable data.
This was not a difficult process for me because I was already very familiar with the D2L platform and had recently begun a trial of the Insights application at my own school. As such, I already had a sense of what data was available within the platform; I was looking to see what D2L could now do with that data inside its new application. The EFLA itself was relatively new to me, however, so I began this process by familiarizing myself with it. Once I felt I had a handle on it, I applied the framework directly to the application with relative ease.
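To make the evaluation step concrete, the sketch below shows one way EFLA-style ratings could be tallied in Python. The four dimension names follow Scheffel et al.'s EFLA, but the 1-10 item scale, the 0-100 normalization, and the sample ratings for D2L Insights are all assumptions for illustration, not the framework's official scoring procedure.

```python
# Minimal sketch of tallying EFLA-style ratings.
# Assumptions: four dimensions per the EFLA, items rated 1-10,
# dimension scores normalized to a 0-100 scale. The official
# scoring procedure may differ.

from statistics import mean

DIMENSIONS = ("Data", "Awareness", "Reflection", "Impact")

def dimension_score(ratings: list[int]) -> float:
    """Map the mean of 1-10 item ratings onto a 0-100 scale."""
    return (mean(ratings) - 1) / 9 * 100

def efla_scores(responses: dict[str, list[int]]) -> dict[str, float]:
    """Compute one normalized score per EFLA dimension."""
    return {dim: dimension_score(responses[dim]) for dim in DIMENSIONS}

# Hypothetical ratings for D2L Insights from one evaluator:
example = {
    "Data": [8, 7, 9],
    "Awareness": [6, 5, 6],
    "Reflection": [4, 5, 3],
    "Impact": [3, 4, 2],
}
print(efla_scores(example))
```

Tallying ratings this way makes the pattern in my critique visible at a glance: a tool can score well on the Data dimension while scoring poorly on Reflection and Impact, which is exactly the gap between collecting data and acting on it.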
I feel I demonstrated achievement of this goal through this artifact, since the key was to find an effective framework for evaluating specific digital learning applications and tools. While I did not develop my own framework for evaluating D2L Insights, I used an existing one that is, in all likelihood, more useful than anything I could have created myself. I was also able to assess the application through the lens of a specific outcome (student engagement), which is a crucial element of this goal. This artifact is evidence that I am capable of making sound assessments of digital learning applications and tools.
Using the EFLA was very effective, so I will look for other established frameworks when assessing different learning tools in order to situate those assessments properly. Many frameworks and assessment instruments target specific outcomes and can be harnessed in a similar way, such as the Online Student Engagement Scale (OSE). Going forward, I will assess other digital learning tools much as I assessed D2L Insights.