I'm trying to understand the buzz behind Learning Analytics.
The syllabus for the Learning Analytics Online Course contains some useful links and resources that provide background on the subject. According to the syllabus, the course lays "a foundation for the upcoming 1st International Learning Analytics and Knowledge Conference held in February, 2011 in Banff, Canada."
From the syllabus:
The growth of data surpasses the ability of organizations or individuals to make sense of it.
This isn't really new. It also raises the question: what data points (i.e., individual human behaviors) are actually valuable to measure? And are the same behaviors predictably and/or uniformly valuable to all individuals across all contexts? I suspect that the answer is "no."
Also from the syllabus:
In an age where educational institutions are under growing pressure to reduce costs and increase efficiency, analytics promises to be an important lens through which to view and plan for change at course and institutions levels.
Money is certainly tight everywhere. However, the claim that crunching data will improve the process of learning to the point where it saves a significant amount of money and time feels analogous to a personal jet pack: it could happen eventually, but claims of a straight line from where we are now to that promised land seem hyperbolic. What specific data points are needed to actually make this valuable to a school or company? More importantly, why should learning require a significant loss of privacy, as embodied by a third party collecting, storing, and analyzing a person's behavior over weeks, months, or even years?
As a side note, it's interesting that Google is probably in the best position to do this type of analysis, as school districts, universities, governments, and companies are all tripping over each other in their rush to throw their user data into the data abyss that is Google Apps.
Of course, companies have long mined their data to improve sales and productivity. But broadening data mining to include analysis of social networks makes new things possible. Modelling social relationships is akin to creating an "index of power", says Stephen Borgatti, a network-analysis expert at the University of Kentucky in Lexington. In some companies, e-mails are analysed automatically to help bosses manage their workers. Employees who are often asked for advice may be good candidates for promotion, for example.
In this use of data mining, the analysis helps highlight individuals who are, in some way, effective. I'm assuming that the learning/productivity patterns of these individuals can then be mined and compared to discover common patterns. However, the level of detail required for meaningful analysis could quickly run up against privacy concerns. From the perspective of a company, knowing what their top performers are doing every minute of every day would be valuable information, but I suspect that the level of scrutiny needed to make this data valuable would be pretty intolerable to the people being scrutinized.
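The "index of power" idea above can be sketched crudely: treat advice-seeking emails as edges in a graph and rank people by in-degree, i.e., how often colleagues turn to them. A minimal, purely illustrative Python sketch (the names and edge list are invented, and real systems would have to extract these edges from email metadata first):

```python
from collections import Counter

# Hypothetical advice-request edges mined from email metadata.
# A pair (sender, recipient) means the sender asked the recipient
# for advice; all names and edges here are invented for illustration.
advice_requests = [
    ("dana", "alex"), ("eli", "alex"), ("dana", "bo"),
    ("alex", "bo"), ("eli", "alex"), ("bo", "alex"),
]

def advice_in_degree(edges):
    """Count how often each person is asked for advice (in-degree)."""
    return Counter(recipient for _, recipient in edges)

# Rank people by how often colleagues turn to them.
ranked = advice_in_degree(advice_requests).most_common()
print(ranked)  # [('alex', 4), ('bo', 2)]
```

Even this toy version makes the privacy tension concrete: producing the ranking requires logging who emailed whom, about what, and how often.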
It's also worth noting that the information returned by this initial analysis is a person, which has some interesting implications for teacher professional development:
"Well, Ms. Jones, your students all did well on their tests, parents report positive interactions, your classroom observations were flawless, but your social graph was quiet between November and January."
When the method for analysis is usable, it will complement existing processes inside formal education, such as Recognition of Prior Learning (RPL) or Assessment of Prior Learning (APL), which theoretically help people find accelerated pathways through curriculum. This really only works where assessment is standardised, such as the national unit standards used in Australia or New Zealand. Higher education resists this standardisation, making RPL and APL impossibly inefficient.
The ultimate would be a method that helps people do their own learning analytics and develop evidence for arguing for more precise RPL or APL.
I imagine this tool as a browser add-on, something that could collect, track, and save communicative and informative activity.
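One way to picture what such an add-on would hold is a simple, learner-owned activity log. A minimal sketch in Python (the schema, field names, and URLs are all invented; a real add-on would live in the browser, not in Python):

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

# Hypothetical schema for the records a learner-controlled
# add-on might keep. Every field name and URL is invented.
@dataclass
class ActivityRecord:
    timestamp: str   # when the activity happened (UTC, ISO 8601)
    url: str         # page where it happened
    kind: str        # e.g. "read", "comment", "share"
    note: str = ""   # optional learner annotation

def record(url: str, kind: str, note: str = "") -> ActivityRecord:
    """Create a timestamped record of one communicative/informative act."""
    return ActivityRecord(
        timestamp=datetime.now(timezone.utc).isoformat(),
        url=url,
        kind=kind,
        note=note,
    )

# A log the learner controls and could later export as RPL/APL evidence.
log = [
    record("https://example.org/course/unit-1", "read"),
    record("https://example.org/forum/thread/42", "comment",
           "answered a peer's question"),
]
print(json.dumps([asdict(r) for r in log], indent=2))
```

The point of the sketch is ownership: the data lives with the learner, who decides what to export as evidence, rather than with a third party doing the observing.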
Learning Analytics, at least by the definitions I have been able to find, is most useful to organizations. Learner control, and ensuring that learners reap the benefit of being hyper-observed, deserve greater consideration.
It's also worth noting that putting time into analyzing how people learn does nothing to address a more fundamental problem, one that better data analysis could help frame: how can informal/non-traditional learning be assessed and credited within formal education and professional development? Weaknesses in how competency and knowledge are assessed need to be part of any comprehensive system that examines how learning can be made more effective. It's difficult to argue that we need to make the processes by which we learn more effective when we haven't yet mastered assessing how people learn now. If you have examples of meaningful assessment of informal learning within traditional environments, please share them in the comments.
Existing Practice, New Branding
Learning Analytics seems like a blend of:
- Traditional forms of data and behavioral analysis;
- The increased processing power of the cloud;
- A better understanding of what the semantic web is and how it works;
- Better search tools, which let less tech-savvy users run more sophisticated queries over larger data sets stored in the cloud; and
- Better visualization tools, which make interpretations of these data sets more intuitive.
It's a powerful set of tools, and as a construct it's interesting, but I'm definitely missing what makes it novel. However, there are a lot of smart people who are interested in it. Please feel free to tell me what I'm missing in the comments.