Problems with Obligations of Knowing: A Short Response to Fritz and Whitmer

In their recent article published in New Directions for Institutional Research, John Fritz and John Whitmer push back against what they perceive as a “do nothing” approach to learning analytics. This post is a brief response to some of their more contentious points.


Introduction

Fritz and Whitmer’s piece begins with the position that, based on their professional experience and reading of the literature, many are taking up “a ‘do nothing’ approach as a way to assure we ‘do no harm’” with regard to learning analytics and the pursuit of student success in higher education. They acknowledge that there is a robust conversation surrounding learning analytics ethics, yet they report being “overwhelmed” by the breadth and depth of issues needing consideration. They write, “such concerns, combined with the sheer time, effort, expense, experience, and expertise it still takes any institution to identify, collect, analyze, and act on its students’ learning data, might lead some to reasonably conclude ‘why bother?’”

Their argument is that ethics, in its myriad forms and calculations, should not hinder the progression of learning analytics. In fact, citing Campbell (2007; see also Willis, Campbell, & Pistilli, 2013; Prinsloo & Slade, 2017), they argue institutions have a different obligation: one to use and act on data and analytics. Moreover, they oppose the idea that students, through their interests and privacy expectations, should have agency in the implementation of learning analytics. I find these positions to be morally suspect.

Paralyzed by Ethics

It’s troubling to read that higher education information professionals such as Fritz and Whitmer (themselves institutional researchers) see the body of ethical arguments as burdensome. Yes, the literature is well developed (though there is always room for more analysis). And it is true that ethical conversations around learning analytics have not ebbed; they’ve stayed fairly constant, if not intensified. But these things are to be expected given that writers continue to respond to new methods and measures created by learning analytics researchers, and to prominent learning analytics initiatives that fall off the (ethical) tracks.

The great work in this area has been done by journalists, practitioners, and researchers in a collective attempt to inform practice, policy development, and technological design. The literature surfaces cases of learning analytics gone astray in order to build awareness; the authors cite one such case, the now-infamous “drown the bunnies” snafu at Mount St. Mary’s University. It also provides ethical guard rails through rigorous analysis and the development of guiding principles. The research literature in particular offers advice and guidance for addressing an array of ethical issues associated with learning analytics. Some of this guidance is general (e.g., “serve the interests of students”), while some prescribes specific ethical actions (e.g., “be transparent with data sources and types used in service to learning analytics”). To characterize these extensive efforts as debilitating rather than guiding is troubling; it reads as an excuse not to engage the material.

Institutional researchers, as well as other institutional actors, have a responsibility to handle student data and information with care and sensitivity. While they have a practical duty to make use of data, for it is the lifeblood of their work, that duty does not trump their moral responsibility to carefully and purposefully make informed ethical choices. Ethics is not clear-cut. It takes time and thought to establish moral positions, determine ethical practices, and codify such things in office and institutional policy.

The Easy Way Out

Fritz and Whitmer use the “obligation to act” argument as a defensive response to the wave of ethics research that they feel limits their goals and the potential of learning analytics. Unfortunately, the authors do very little to actually engage the ethical thinking Campbell (and his colleagues) put into developing the “obligation to act” argument. Instead, they treat it as the opposite pole of an ethical landscape they find exhausting, which I think is an easy way out. Rather than acting slowly, or, where justified, not at all, they argue that institutions must act simply because they hold student data and that data reflects knowledge.

There are two major problems with this line of thought (though others will find different faults). First, it treats all data, and the analytics derived from that data, equally. That is to say, one data source is taken to be no more or less harmful than another; therefore, the license to mine and analyze that data is the same. Take the example of geolocation data gathered from student ID card swipes at dorms, labs, and other locations. One could argue that knowing a student’s movements and social network associations (as derived from a social network analysis of the timestamps of nearby associates’ swipes) can help develop student retention models, and such models may plausibly lead to new retention intervention practices. The justification is that the data is known and the analytics can be developed; therefore, the institution should act on them. But that justification flies in the face of grounded, established, and defended arguments against personal surveillance, especially in educational contexts. To make concrete how low the technical barrier actually is, consider the sketch below.
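What follows is a minimal sketch in Python, with entirely hypothetical field names and thresholds (nothing here comes from Fritz and Whitmer’s article), of how a co-location graph could be derived from card-swipe logs: two students who swipe into the same location within a short window are linked, and the resulting edge counts are exactly the kind of social-network input a retention model might consume.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical swipe records: (student_id, location, unix_timestamp).
# The field names and the 10-minute window are illustrative assumptions.
swipes = [
    ("s001", "dorm_a", 1_700_000_000),
    ("s002", "dorm_a", 1_700_000_120),
    ("s003", "lab_3", 1_700_000_300),
    ("s001", "lab_3", 1_700_000_400),
]

WINDOW_SECONDS = 600  # treat swipes within 10 minutes as co-location

def co_location_edges(records, window=WINDOW_SECONDS):
    """Count how often each pair of students swipes into the same
    location within `window` seconds of each other."""
    by_location = defaultdict(list)
    for student, location, ts in records:
        by_location[location].append((ts, student))

    edges = defaultdict(int)
    for events in by_location.values():
        events.sort()  # order swipes at each location by time
        for (t1, a), (t2, b) in combinations(events, 2):
            if a != b and abs(t2 - t1) <= window:
                edges[tuple(sorted((a, b)))] += 1
    return dict(edges)

print(co_location_edges(swipes))
# {('s001', 's002'): 1, ('s001', 's003'): 1}
```

That a few dozen lines suffice to turn door swipes into a map of students’ associations is precisely the point: “we can, therefore we should” is not an ethical argument.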

The second problem is that Fritz and Whitmer argue that an obligation to act should never be trumped by student preference and review, stating as much in the very last sentence of the article: “We fully support a measured, thoughtful, reflective, and even cooperative approach to doing so with students, as long as our initiatives are not dependent on students’ prior approval” (Fritz & Whitmer, 2019). They write that providing students choice in their privacy in ways that may limit data collection and analysis “bequeaths some of the institutional responsibility for learning to the opinions of students,” arguing that such privacy-enhancing practices would introduce statistical error into their analyses. Why should the statistical validity of their models be put before the human rights students expect? If so many students express concern about an analytic practice that the model is impaired, then perhaps the model was morally corrupt to begin with. It’s clear in their writing that they welcome student input, but it’s also clear that such input has little value to them when they disagree with it.

About FERPA

One final response concerns claims about the Family Educational Rights and Privacy Act (FERPA), the main student privacy law in the United States. Fritz and Whitmer lament that researchers and colleagues use FERPA as a privacy mandate and a barrier to data-driven educational research. Like them, I have encountered similar sentiments from faculty, and I have read about such claims by collegiate sports administrators when journalists seek student information. There is truth to this. But there is no truth to their argument that “FERPA obliges [emphasis added] institutions to study the experience of past students to improve the experience of current and future ones” (Fritz & Whitmer, 2019). That is a novel and incorrect interpretation of the law.

A more appropriate and accurate reading of FERPA is that its design is flawed where privacy is concerned but enabling where learning analytics is concerned. In fact, my colleagues and I wrote about FERPA’s learning analytics-friendly loopholes in our forthcoming article in the Journal of the Association for Information Science and Technology (Jones, Rubel, & LeClere, 2020). We noted that a “legitimate educational interest” is defined by the institution itself. A student’s privacy rights depend in part on how such an interest is defined, including how data is collected, shared, and used. An institution can therefore craft a sweeping definition of the interest, granting many different types of institutional actors (and third parties) access to student data for analytic purposes.

Our concern, then, shouldn’t be with using FERPA (wrongly) as a hindrance to learning analytics. Rather, we should recognize its flaws and, as institutions, step up to build the student privacy protections that a law written in 1974 could not have anticipated needing in an age of analytics and big data. Exploiting its loopholes, and misreading its purpose, is another signal that institutional actors haven’t truly considered the near and far downstream harms that learning analytics can create.

Concluding Thoughts

Maybe my writing will be perceived by some as another attempt to slow the cogs of learning analytics. I cannot change anyone’s perception of my motivations. The point of this short response is simply to highlight the weaknesses I found in two authors’ approach to learning analytics ethics. In doing so, perhaps others will be able to recognize and address the faulty reasoning in the “obligation to act” argument, as laid out by Fritz and Whitmer, when they see it replicated elsewhere.


References

Campbell, J. P. (2007). Utilizing student data within the course management system to determine undergraduate student academic success: An exploratory study. ProQuest Theses and Dissertations, 1–219. https://docs.lib.purdue.edu/dissertations/AAI3287222/
Fritz, J., & Whitmer, J. (2019). Ethical learning analytics: “Do no harm” versus “do nothing.” New Directions for Institutional Research, 2019(183), 27–38. https://doi.org/10.1002/ir.20310
Jones, K. M. L., Rubel, A., & LeClere, E. (2020). A matter of trust: Higher education institutions as information fiduciaries in an age of educational data mining and learning analytics. Journal of the Association for Information Science and Technology, 71(10), 1227–1241. https://doi.org/10.1002/asi.24327
Prinsloo, P., & Slade, S. (2017). An elephant in the learning analytics room: The obligation to act. Proceedings of the Seventh International Learning Analytics & Knowledge Conference, 46–55. https://doi.org/10.1145/3027385.3027406
Willis, J. E., Campbell, J., & Pistilli, M. (2013). Ethics, big data, and analytics: A model for application. EDUCAUSE Review. https://er.educause.edu/articles/2013/5/ethics-big-data-and-analytics-a-model-for-application

Kyle M. L. Jones

Dr. Kyle M. L. Jones is an associate professor in the Department of Library and Information Science within the School of Informatics and Computing at Indiana University-Indianapolis (IUPUI).