Enjoy this follow-up to last week's blog post, Data, No Big Thing (Yet...) by Myk Garn.
The first time I heard the conjunction of precision and learning was in an article by Gus Evrard at the University of Michigan. In it he cites a 2013 article in Nature in which Alla Katsnelson noted a shift in the lexicon of modern clinical medicine: the framework once known as personalized medicine had morphed into precision medicine. “Might educators soon be following suit?” Professor Evrard wondered. IMHO, we will.
Personalizing education is nothing new. But personalizing at scale (meaning with groups larger than 25) is both a need and a challenge: a need because, through the infusion of technology, we can control costs better, and a challenge because it means systemic shifts (and likely disruptions) to our current instructional models. As education moves from its ‘golden era’ of industrial, mass production of leaders and workers to a new century of learning ecosystems that mass-customize education to maximize performance and proficiency for every learner, we can expect to see many operations, and increasingly interactions, digitized.
This shift, from low-ratio manual to high-ratio digital personalization, is being driven now, in part, by our need to make education more affordable, especially for learners who are traditionally under-funded and underserved. But just as the practice of precision medicine is being driven by extensive laboratory and clinical data that can now be parsed, mined and focused into solutions for ever more finely grained categories of individuals, the possibility of precision academics will require much greater streams of much more finely detailed data, about and from learners, in real time, to inform and power machine-learning platforms such as IBM’s Watson, from which a virtual teaching assistant has been fashioned at Georgia Tech.
Building the Academic Genome
But that’s getting ahead of our story. In 1988, when the concept of the human genome was first articulated, there were 280 DNA-based patents awarded. By 2000, when the rough draft was completed, there were 3,828, and by 2003, when the genome was completed, 17,139 patents had been awarded or were pending. This explosion of DNA-based patents has moved whole areas of medical research from trial and error within the black box of the lone practitioner to a discipline grounded in, and powered by, accurate digital data that is a force multiplier for high-powered analysis and action. So, is there a correlated development going on in education? Yes, there is. It’s currently called competency-based education. The explicit specification and linking of competencies, content, activities and assessments, in digital learning ecosystems, is building into an ‘academic genome’ wherein the accumulated analog proficiencies and practices of the individual classroom can become the digital assets, algorithms and processes of learning scientists, instructional strategists, faculty, and a new generation of learners.
The Four P’s of Precision
The heuristic that describes this digital imperative for education was well stated by Andrew Luna in 2012: “If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it. If you can’t improve it, then why the heck are you doing it?” To measure, control, and transform higher education, the data streams we work with will have to be pervasive, personalized, progressive and persistent. Pervasive, meaning we need a lot of data coming from every possible source; just as we talk about ubiquitous connectivity wherever we go, the more of our academic data stream we can capture and analyze, the more we can do with it. Personalized data, which characterizes and contextualizes each learner as an individual, is essential to building precise, actionable outputs and feedback. The data must be progressively collected and kept, connecting performance, feedback and advising from one course to the next. And the data must persistently follow the learner through college, into the workforce and throughout her career.
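As a rough illustration, and not any particular platform's schema, the four P's can be read as requirements on each record in a learner data stream. The field names and event types below are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class LearnerEvent:
    """One record in a hypothetical learner data stream,
    sketching the four P's (all names here are assumptions)."""
    learner_id: str           # personalized: tied to an individual learner
    source: str               # pervasive: LMS, homework tool, advising system, ...
    course_id: str            # progressive: links performance across courses
    event_type: str           # e.g. "quiz_submitted", "feedback_viewed"
    score: Optional[float] = None
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )                         # persistent: time-stamped for a longitudinal record

# A pervasive stream aggregates events from many sources for one learner
stream = [
    LearnerEvent("s123", "lms", "MATH101", "quiz_submitted", score=0.85),
    LearnerEvent("s123", "homework_platform", "MATH101", "problem_attempted", score=0.6),
    LearnerEvent("s123", "advising", "MATH101", "advisor_note"),
]

# Progressive use: a running average across scored events in the stream
scores = [e.score for e in stream if e.score is not None]
print(round(sum(scores) / len(scores), 3))  # prints 0.725
```

Because every record carries the learner, the source and a timestamp, the same stream supports both an immediate dashboard view and a longitudinal record that can follow the learner across courses.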
Are We Ready for Digital Reinvention?
Only somewhat. We are at the very beginning of conceptualizing what the use of such data will do for education. But it is unquestionable that we are developing, at an increasing pace, the means to digitally model and manage learning in new and powerful ways.
There are those who worry this is yet another attempt to standardize, prescribe and control what faculty are allowed to do. And while there is a legitimate argument that moving from proprietary, single-practitioner, master-and-apprentice instruction to data-rich, networked, shared models of learner ecosystems can increase efficiency, the view through the lens of quality and efficacy is more relevant, and more promising. By aggregating and linking competencies and summative assessments across a curriculum, the faculty's entire purview to identify and use content, activities and formative feedback can be freed up and informed by robust, real-time performance data that guides experiments and improvements by learners and faculty alike.
Precision Academics will be a force-multiplier for digitally-driven, mass personalization of learning for--and by--the learners.
How Do We Get Started?
By ensuring that your curriculum design strategy, technology stack and data policies enable: generation and capture of learning-level data (in as near real time as possible); sharing of data between platforms; analysis that is both immediate and longitudinal; and output that is informative and actionable. This may require new design standards for curriculum development that link with, and build in, additional data-generating activities, e.g. from supplemental tools such as SoftChalk, ancillary platforms like WebAssign and other nodes in the learner's academic ecosystem. Data security policies will need to be harmonized and balanced against the benefits sharing can bring to individual learners. Upgrades to existing platforms (e.g. adding an analytics package to existing learning management software) or conversion to new platforms designed to share, aggregate and analyze data may be needed. In some instances, custom middleware or agile data integration platforms may provide solutions. Completely new platforms, e.g. customer relationship management software, also need to be considered when trying to make data persistent and usable across the student's journey. Design of sophisticated algorithms behind the scenes, surfaced through simple, intuitive dashboards for learners, faculty, staff and administrators, will (hopefully) bring elegance and efficacy to what is anything but a trivial endeavor.
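To make the middleware idea above concrete, here is a minimal sketch of what "sharing data between platforms" can mean in practice: mapping each platform's native record format onto one shared event schema so that immediate and longitudinal analysis become possible. Every field name, record shape and value here is an invented assumption, not the actual API of any LMS or homework platform:

```python
from datetime import datetime

# Hypothetical raw records from two platforms with different field names
lms_record = {"student": "s123", "assessment": "quiz-3", "pct": 85,
              "ts": "2024-01-15T10:30:00+00:00"}
homework_record = {"user_id": "s123", "item": "set-7", "grade": 0.72,
                   "submitted_at": "2024-01-16T09:00:00+00:00"}

def normalize(record: dict, source: str) -> dict:
    """Middleware sketch: map each platform's schema onto one shared,
    longitudinal event format (all field names are assumptions)."""
    if source == "lms":
        return {"learner_id": record["student"], "activity": record["assessment"],
                "score": record["pct"] / 100.0, "timestamp": record["ts"],
                "source": source}
    if source == "homework":
        return {"learner_id": record["user_id"], "activity": record["item"],
                "score": record["grade"], "timestamp": record["submitted_at"],
                "source": source}
    raise ValueError(f"unknown source: {source}")

# A shared event store is what makes analysis across platforms possible
events = [normalize(lms_record, "lms"), normalize(homework_record, "homework")]
events.sort(key=lambda e: datetime.fromisoformat(e["timestamp"]))

# Immediate view: the learner's most recent activity across all platforms
latest = events[-1]
print(latest["activity"], latest["score"])  # prints: set-7 0.72
```

In a real deployment this normalization layer is typically what a data integration platform or a standard such as a shared learning-event specification provides; the point of the sketch is only that disparate schemas must converge on one format before the data can persistently follow the learner.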