Teachers and school administrators are curious about what technology can do for learning. What is software supposed to be doing for the process of education? In the piecemeal state of various slices of content, different courseware systems, collections of online books, assessments, grades, teacher results, financials, human resource information and more, it's easy to lose all sense of direction in single-function data.
There is a scale of progression in any data analytics, but the focus of the moment is student-centered information that leads to the ability to prescribe specific learning. This is the personalization nirvana so many schools are aiming for, but getting there is another matter. The first question teachers and leaders ask is, "What is happening with learning?" This leads to querying and searching for simple report data on grades and scores, which is the lowest level of inquiry. The next level up is analysis: comparing like data with like, looking at that data over time, and reviewing simple trend graphs. The final level in the "What's Happening" arena is continual monitoring of the data with an automated digital dashboard to visualize it, enabling teachers to personalize instruction through manual processes. Many professional development workshops focus attention on this use, guiding teachers to review the analysis and then direct classroom activities that adapt lessons to fit groups or sub-groups of students. In some instances, the slowest achievers are mentored one-on-one in class while the other students work in group activities.
The monitoring stage is when schools first start to see the utility of courseware that automatically locates gaps in understanding and remediates them with content and questioning until the student masters the concept. At this stage, schools generally still keep students grouped by class and grade, gating the software to hold students in that pattern rather than letting some sail forward in the content to higher levels. Teachers unfamiliar with courseware will use it as a supplement to core content, as a sort of reward or practice. If they also use independent lessons, built with or without a framework system for making courses such as a Learning Management System (LMS), the picture of what is happening with students becomes dispersed. Multiple courseware log-ins, multiple apps, and other systems complicate the picture and reduce the ease of monitoring. Coupled with new goals of personalization, this complexity often fuels the deeper question of why some learning is not happening while some is.
The next question on the scale is "Why?" This is where diagnosis of student data across time comes in. At this level, schools are trying to unravel why a student, or a whole class, is stumbling in a subject. They seek to pinpoint earlier omissions and fix a learning issue with some sort of intervention. The "Why?" question is also the genesis of cross-tabulation with other types of data that might point to reasons, including social-emotional trauma, teacher skill, and more. This overall diagnostics stage has schools looking hard at their data collection and use. Leaders will be concerned about systems interoperability so they can make a true diagnosis with all the data. Finding the real "why" behind learning problems is what leads to real achievement.
With loads of diagnostic data accumulating in schools, along with corrections made for students, the natural next stage is to wonder what is likely to happen for any one student, class, grade, or whole school. Leaders see that they need to be able to predict what's likely to happen so that they can act earlier. Many digital courseware systems already come loaded with probability dashboards showing student completion timelines and other inferences drawn from their digital activity. The idea of using a lot of data reflects an industry trend called "Big Data," which may not actually mean tons and tons of data; more commonly it refers to the use of various data sets to provide predictive analytics.
The final question in the maturity scale of education analytics is "What Do I Do?" This question builds on the predictive levels of analytics to prescribe paths of action. The goal is recommendations based on the topic associated with the data, or on a tangential topic. At the institutional level, this may mean answering the question "Where do I open another school?" by combining population growth data with existing school geographic locations, economic information and more. Plotting the data against a map can show exactly where it makes sense to position a school, as in the work of the company GuideK12.
At the learning level, advanced courseware is providing initial assessments that place students into the correct levels for their learning, along with embedded ongoing formative assessments that automatically prescribe pathways. Framework systems allow this work to be done by individual teachers while providing some diagnostic data teachers can use to prescribe the next lessons. The ultimate in prescriptive personalized learning will come when systems are so integrated, and content repositories so deep, that algorithms can be built to learn student needs and not only guide students in core subjects but also recommend subjects entirely outside of those for deeper understanding. An example might be recommending that a math whiz kid study the foundations of mathematics among the Chinese and Egyptians as a cultural study.
Already the technology industry is pointing to a future it calls "Big Learning." This will be the stage when the convergence of data meets "machine learning," in which machines learn from other machines to provide more valuable inferences back to human users through artificial intelligence.
*This chapter, "Primer on Education Analytics," was taken from the Learning Counsel Special Report ANALYTICS AND ONBOARDING. For the full Special Report, click here: http://thelearningcounsel.com/paper/analytics-and-onboarding