In this blog post I'm going to discuss the ethical considerations and implementation of learning analytics in a corporate training context. It follows on from the flipped learning project design I've discussed previously. The discussion is therefore rooted in my particular context, but I think the overarching ideas about ethics and implementation carry over to many other contexts and fields.
To introduce that context: the learning analytics implementation is for the British Council Thailand's Professional Training Centre (PTC) in Bangkok. I'll detail the primary ethical considerations, and recommend good practice, for implementing learning analytics in the development and use of the department's new flipped learning product in a Thai corporate training context.
What is learning analytics?
Learning analytics can be defined as how data about learners is measured, collected, analysed, and acted upon to optimize learning and the learning environments in which that learning occurs.
Learning analytics may give us the means to bridge gaps relating to issues such as students' lack of engagement, low motivation, and difficulties with online materials, compounded by the fact that teachers lack the visual and interactional cues that would normally signal those difficulties. With learning analytics we can identify at-risk students, or those not engaging with the materials, and plan interventions to support their learning journey.
Analytics will be used in this context as part of the online aspects of the flipped learning courses to identify whether learning designs are being adhered to, ensuring learners are prepared for the in-class sessions. Through analysis of learners' habits, actions, interactions, failures, and successes in their use of the LMS, the content, and the devices used, the PTC can predict learners' requirements and work out how to improve materials, communication, and access, informing the development of our courses going forward. This will make our product more effective, and therefore more competitive, increasing our likelihood of winning future business from clients.
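As a concrete illustration of the kind of flagging described above, here is a minimal sketch in Python. The activity records and field names are hypothetical, not drawn from any particular LMS export; a real implementation would work against the LMS's actual reporting API or data export.

```python
# Hypothetical LMS activity records; field names are illustrative only,
# not taken from any real LMS export.
activity_log = [
    {"learner": "A01", "module": "M1", "completed": True},
    {"learner": "A01", "module": "M2", "completed": False},
    {"learner": "A02", "module": "M1", "completed": True},
    {"learner": "A02", "module": "M2", "completed": True},
    {"learner": "A03", "module": "M1", "completed": False},
    {"learner": "A03", "module": "M2", "completed": False},
]

def flag_at_risk(log, threshold=0.5):
    """Return learners whose completion rate is below the threshold."""
    totals, done = {}, {}
    for rec in log:
        who = rec["learner"]
        totals[who] = totals.get(who, 0) + 1
        done[who] = done.get(who, 0) + int(rec["completed"])
    return sorted(who for who in totals if done[who] / totals[who] < threshold)

# A01 completed exactly half, A02 everything, A03 nothing.
print(flag_at_risk(activity_log))
```

A flag like this is only a prompt for a human conversation, not a verdict; as discussed below, the interpretation and any intervention carry ethical weight.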
The introduction of learning analytics and therefore the flipped learning product itself will not be successful if:
- there is no buy-in from students and educators: they must appreciate that it complements the teaching and learning processes
- responsible parties do not have sufficient time or training to use it
- learners do not have sufficient time to study the materials
Also, we cannot measure all online learning undertaken by prospective students. Some of that learning will happen outside of our LMS on websites and social networks that we may not be able to extract data from.
What are ethics in learning analytics?
Ethical issues for learning analytics fall into three overlapping categories: where data is located and how it is interpreted; informed consent, privacy, and de-identification of data; and how data is managed, classified, and stored.
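On de-identification specifically, a common first step is pseudonymisation: replacing direct identifiers with tokens so that records can still be linked across datasets without exposing names. The sketch below assumes a hypothetical record format and uses a placeholder salt; in practice the salt would be kept secret and stored separately from the analytics data.

```python
import hashlib

# Placeholder salt for illustration; a real deployment would keep the salt
# secret and store it apart from the analytics data.
SALT = "replace-me"

def pseudonymise(record):
    """Return a copy of the record with the learner's name replaced by a token."""
    token = hashlib.sha256((SALT + record["name"]).encode()).hexdigest()[:12]
    out = {k: v for k, v in record.items() if k != "name"}
    out["learner_id"] = token
    return out

# Hypothetical learner record.
rec = {"name": "Somchai", "module": "M1", "completed": True}
print(pseudonymise(rec))
```

Note that pseudonymised data is not anonymous: anyone holding the salt can regenerate the tokens, so access to it needs the same governance as the names themselves.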
What makes data in a learning context unique, and distinct from the use of data in marketing, for example, is how it relates to moral practice, the recognition of students as developing participatory agents in its collection and use, and the necessity of transparency in that use.
Areas of concern
Areas we need to be mindful of include:
- Mislabelling students based on incomplete, incorrect, or inaccurately collected information
- Not considering unobservable factors relating to students' personal lives, emotional states, and social and economic circumstances
- Restricting avenues of learning to our materials and course alone in preparation for the in-class elements; for example, a student may prefer to prepare using books they already have access to
Critically, in this Thai corporate training context, we need to consider the power relations between all stakeholders: learners, teachers, clients, administrators, and management. We need to adopt a socio-critical perspective, which means being cognizant of the ways the cultural, political, social, physical, and economic contexts in Thailand inform our decisions about learning analytics. The same naturally applies in any context. Put simply: think of who you're dealing with and consider the culture in which your work is being applied.
Considerations for introducing / using analytics
Learners' expectations and perceptions must be managed carefully. Their engagement with online materials should be engendered by the learning design, not by the threat of failing the course or of their superiors being notified of inappropriate or incomplete use of the materials, such as skipping quickly through lesson pages.
That kind of surveillance atmosphere may result in demotivation and resentment, potentially affecting future revenues. It will be necessary to personalise reporting so that learners can understand it and see how it relates to the enhanced effectiveness of their learning. Transparency, and the opportunity to provide qualitative feedback, are requirements.
In our flipped learning online designs we will try to promote continued learning by adopting principles from Connectivist learning theory: we look to foster in learners an appreciation for finding, and becoming part of, networks of specialist connections where they can both source and provide information. We must therefore consider how this portion of the learning can be evidenced.
Generally, the learning designs might reflect this in work then conducted in class, such as learners presenting what they learned and found online, or providing evidence in printouts and the like, as instructed by specific tasks that pushed them to look beyond the boundaries of the e-learning environment and into the wilds of the Internet proper.
However, we can also employ web forums so that learners produce and reflect on that evidence there, and analytics can tell us whether this has been done. Some learners will find these activities more difficult due to their English competency. There is also the contextual factor of 'face' in Thai society, which may make these kinds of interactions, where learners' work is visible to everyone, difficult.
Finally, whoever is chosen to interpret the accumulated data needs to understand the context of that individual learner at that point in the course and how their interpretation of the data and their resultant actions have ethical consequences. Essentially again, we have to consider that our learners are people and our actions might have far-ranging consequences for them.
It is likely that any organisation will have to be judicious in what information is shared with clients. Learners will have to be made aware of what is collected, why, what it is used for, and what will be provided to their superiors.
It might be that interventions, or the lack thereof, would be based on shared characteristics or trends in the cohorts. For example, if the cohort as a whole is not engaging with the online parts of the course that feed into the in-class productive elements, that may have to be raised with the client, but not before we investigate whether there is an issue with the materials or the technologies used in their delivery.
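That ordering, checking the materials and technology before blaming the cohort, can be sketched as a simple per-activity comparison: if one activity lags the others badly, that activity is the likelier culprit; uniformly low rates point to broader disengagement. The data, field names, and the 0.3 gap threshold below are all hypothetical choices for illustration.

```python
# Hypothetical per-activity completion records for one cohort.
cohort_log = [
    {"activity": "video", "completed": True},
    {"activity": "video", "completed": True},
    {"activity": "video", "completed": True},
    {"activity": "quiz", "completed": False},
    {"activity": "quiz", "completed": True},
    {"activity": "quiz", "completed": False},
]

def completion_by_activity(log):
    """Completion rate per activity across the cohort."""
    totals, done = {}, {}
    for rec in log:
        a = rec["activity"]
        totals[a] = totals.get(a, 0) + 1
        done[a] = done.get(a, 0) + int(rec["completed"])
    return {a: done[a] / totals[a] for a in totals}

def suspect_materials_issue(rates, gap=0.3):
    """True when one activity lags the best performer by `gap` or more,
    suggesting a problem with that activity rather than general disengagement."""
    return max(rates.values()) - min(rates.values()) >= gap

rates = completion_by_activity(cohort_log)
print(rates, suspect_materials_issue(rates))
```

Here the quiz lags the video badly, so we would examine the quiz materials and their delivery before raising learner engagement with the client.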
However, interventions must be weighed against priorities. Do we maximise the effectiveness of our learning designs, or ensure profitability? We run the risk of alienating learners with certain interventions, and this may hurt our chances of clients returning to us in the future if learners give negative feedback to their superiors. On the flip-side, some employers may welcome that stringent approach. One might suppose this is something to consider at the initial stages of discussion with the (prospective) client, based on accrued information from past dealings or knowledge of their general working environment and policies. Such information is increasingly available through websites such as Glassdoor.com, where 'employees and former employees anonymously review companies and their management' (Wiki).
Here are a few resources that could help an organisation interested in applying learning analytics. I'll set out their strengths and weaknesses to show how they can be used, but also what needs to be considered in relation to that potential use.
Resource 1
Strengths:
- Good overview of considerations for the ethical use of student data
- Gives information on how personal information can be updated
- Details what tutors have access to and why
- Could be used as a model
Weaknesses:
- Does not set out how data might be secured
- Potential differences in the learner/educator relationship between the OU and other contexts
- No information on the ethics of making data available to superiors, which might impact learners' progress in their careers
Resource 2
Strengths:
- Attractively designed document
- Could provide a model
- Sets out principles of ethical use
- Sets out the shared responsibility of the student and the organisation for their learning
Weaknesses:
- Clients may desire more stringent surveillance
- Courses are generally tailored to individual clients
- The document would be expensive
Resource 3
Strengths:
- Detailed information covered in Resource 2
- Can be made available to learners and clients for deeper understanding
Weaknesses:
- Long and detailed
- Possibly unlikely to be read by most learners
Recommendations for good practice
Finally, based on what has been discussed above, I’d like to make some recommendations.
Ensure learners fully understand what is collected, why, and the benefits it affords. It should also be made clear to them that they will have opportunities to provide feedback on this area of their course. This, together with transparency of use, is a potential antidote to the resistance that might arise from learning analytics being interpreted as surveillance.
Involve the teachers/trainers who will be conducting any in-class elements of the courses in the discussion of which analytics should be sought and used. Teachers should also participate fully in intervention processes so that they can give feedback on individual learners' in-class performance. This could necessitate training and would have to be built into the teacher/trainer's schedule, which might adversely affect the profitability of the product through staffing hours and might need to be figured into course prices.
Rather than treating it as a purely administrative role, responsibility for interpreting learning analytics data should rest with someone who has the relevant educational training and an understanding of the materials and the pertinent ethics. This might safeguard against jeopardising existing client relationships through overly systematic decision-making about perceived actionable insights. Creating a role with specialised responsibilities might ensure that well-informed judgements are made on interventions and on notifying clients of learners' misconduct.
To conclude, the main theme to draw from the above, perhaps, is that while learning analytics can be a powerful tool in learning contexts, affording us a broader perspective and improved insight into the learning that may go on outside the four walls of a classroom or training room, we must consider the consequences of how it is implemented and of when we take action on what we see.
Comments as always are welcome.