Ithaka releases report on “Student Data in the Digital Era”

From the abstract:

 Individual researchers, higher education institutions, and other organizations working in these areas are often hindered by challenges related to technical and analytical capacity and institutional culture, as well as sorting out what it means to collect and use data responsibly. Many have deferred or abandoned efforts in the face of these obstacles. Addressing these challenges, and achieving the potential benefits of the new student data, will require a set of guiding principles, coordination within and across institutions, and enhanced technological infrastructure.[1]

To provide an overview of this landscape, we reviewed initiatives in three broad categories:

  • Research: Student data are used to conduct empirical studies designed primarily to advance knowledge in the field, though with the potential to influence institutional practices and interventions.
  • Application: Student data are used to inform changes in institutional practices, programs, or policies, in order to improve student learning and support.
  • Representation: Student data are used to report on the educational experiences and achievements of students to internal and external audiences, in ways that are more extensive and nuanced than the traditional transcript.

Comcast understands the value of privacy, might charge for it

Comcast is deliberating on asking customers to pay extra for privacy, and defended a pay-for-privacy pricing structure in a letter to the Federal Communications Commission. AT&T already offers a program called “Internet Preferences” that provides lower monthly rates to customers in Austin, Texas, and Kansas City who allow AT&T to track their web history and other behavioral data, which can then be used by advertisers. Users who opt out of “Internet Preferences,” which DSLReports calls a “deep packet inspection program that tracks your browsing behavior around the internet—down to the second,” face a $30 premium on their monthly bill.

Neo-liberal Reform and the Big Data University

Andrew Feenberg has taken issue with the “neo-liberal agenda” that is currently guiding how far too many universities both conceptualize and use “educational technology.” In this article, I expand the scope of his critical discussion to include analysis of contemporary higher education initiatives that capitalize on big data.

Libraries, Neoliberalism, and Oppression

At the end of Libraries and the Enlightenment, I suggest that libraries are places “where values other than the strictly commercial survive and inspire, places people can go, physically or virtually, and emerge better people, their lives improved and through them perhaps our society improved.” The key is “values other than the strictly commercial,” because I think public and academic libraries are examples of public spaces where commercial values don’t dominate. They are public goods founded upon the values of democratic freedom and critical reason and provide a possible location within society to promote and protect anti-neoliberal values. Librarians in general are committed to open access to information and education. As Barbara Fister just wrote, they are gatekeepers who want to keep the gates open.

The Neoliberal Library: Resistance is not futile

Neoliberal thinking tells us a successful reference “transaction” provides the patron with the most efficient answer to their immediate information need. Neoliberal thinking mocks the idea that library instruction and reference might be about encouraging students to think critically not only about their own information consumption but also about the whole system of knowledge creation & access, and about who controls how we search and what we find. Neoliberalism scoffs at the idea that librarians ought to encourage browsing and serendipity and other forms of “inefficient” research and learning.

Neoliberalism frames this as a contrast between giving patrons what they want vs. giving them what we think they need. That formulation is a rhetorical strategy that makes librarians sound like condescending bunheads who aren’t hip to what the kids need.

What I want to suggest is that we can and should resist that rhetoric – both because it is incredibly sexist and ageist and because the tension is not between what our patrons ask for and what we want to give them; the tension is between a neoliberal, transaction model of library services and a model based on the mission of promoting critical thinking and equipping students to interrogate power and authority.

Critical Algorithm Studies: a Reading List

This list is an attempt to collect and categorize a growing critical literature on algorithms as social concerns. The work included spans sociology, anthropology, science and technology studies, geography, communication, media studies, and legal studies, among others. Our interest in assembling this list was to catalog the emergence of “algorithms” as objects of interest for disciplines beyond mathematics, computer science, and software engineering.

Campus Support Systems for Technical Researchers Navigating Big Data Ethics

Complex data sets raise challenging ethical questions about risk to individuals who are not sufficiently covered by computer science training, ethics codes, or Institutional Review Boards (IRBs). The use of publicly available, corporate, and government data sets may reveal human practices, behaviors, and interactions in unintended ways, creating the need for new kinds of ethical support. Secondary data use invokes privacy and consent concerns. A team at Data & Society recently conducted interviews and campus visits with computer science researchers and librarians at eight U.S. universities to examine the role of research librarians in assisting technical researchers as they navigate emerging issues of privacy, ethics, and equitable access to data at different phases of the research process.[1]

Blowing Off Class? We Know

Tools developed in-house and by a slew of companies now give administrators digital dashboards that can code students red or green to highlight who may be in academic trouble. Handsome “heat maps” — some powered by apps that update four times a day — can alert professors to students who may be cramming rather than keeping up. As part of a broader effort to measure the “campus engagement” of its students, Ball State University in Indiana goes so far as to monitor whether students are swiping in with their ID cards to campus-sponsored parties at the student center on Saturday nights.
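The red/green coding described above can be pictured as a simple rule over engagement signals. This is a minimal, hypothetical sketch: the field names, thresholds, and scoring rule are assumptions for illustration, not the logic of any actual campus product.

```python
# Hypothetical sketch of a red/green "dashboard" flag based on engagement
# signals (LMS logins, ID-card swipes, assignment submissions). All field
# names and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class EngagementRecord:
    lms_logins_per_week: int   # learning-management-system activity
    card_swipes_per_week: int  # e.g., swipes into campus events or facilities
    assignments_submitted: int
    assignments_due: int

def flag_student(rec: EngagementRecord) -> str:
    """Return 'red' (possible academic trouble) or 'green' (on track)."""
    submit_rate = (rec.assignments_submitted / rec.assignments_due
                   if rec.assignments_due else 1.0)
    # Flag red when engagement falls below the assumed thresholds.
    if rec.lms_logins_per_week < 2 or submit_rate < 0.5:
        return "red"
    if rec.card_swipes_per_week == 0 and submit_rate < 0.8:
        return "red"
    return "green"

print(flag_student(EngagementRecord(1, 0, 2, 6)))  # low engagement -> red
print(flag_student(EngagementRecord(5, 3, 6, 6)))  # on track -> green
```

Even this toy version makes the surveillance trade-off visible: the more behavioral streams the rule consumes (card swipes four times a day), the finer-grained the monitoring of students becomes.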

Predictive Analytics: Nudging, Shoving, and Smacking Behaviors in Higher Education

Nudging opens up risks at two opposite extremes, both linked to data and how data is used. The first risk is the danger of ignoring variance in data: valuable data elements that may affect our understanding of the underlying phenomenon and the design of the intervention — elements such as diverse information that is difficult to capture — can be overlooked. Second, at the other extreme, academia may be flirting with discrimination by using group attributes to generalize patterns across individuals who happen to have features connecting them to one or more categories. Algorithms pick out data points that make up a small (e.g., high school GPA, major, hometown, residence, financial aid status) or large (e.g., race, socioeconomic status, marital status, gender) portion of an individual’s experience, but should these data points become a factor in the types of nudges used?
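The second risk can be made concrete with a small sketch: the same student record yields a different nudge score depending on whether group attributes enter the algorithm. The feature names and weights below are invented for illustration and do not come from any real system.

```python
# Hypothetical illustration of the group-attribute concern: a nudge-targeting
# score that optionally generalizes from category membership. All features
# and weights are assumptions made up for this example.

def nudge_score(features: dict, use_group_attributes: bool) -> float:
    """Higher score -> student is targeted with a more aggressive nudge."""
    score = 0.0
    # Individual-level signals (a small portion of a student's experience).
    if features.get("hs_gpa", 4.0) < 2.5:
        score += 1.0
    if features.get("credits_behind", 0) > 6:
        score += 1.0
    if use_group_attributes:
        # Group-level signals (a large portion of identity): scoring on
        # these treats the individual as a stand-in for a category.
        if features.get("first_generation"):
            score += 0.5
        if features.get("low_income"):
            score += 0.5
    return score

student = {"hs_gpa": 3.1, "credits_behind": 2,
           "first_generation": True, "low_income": True}

# Same individual record, different scores depending on whether group
# attributes enter the algorithm:
print(nudge_score(student, use_group_attributes=False))  # 0.0
print(nudge_score(student, use_group_attributes=True))   # 1.0
```

A student who shows no individual warning signs is nudged anyway once group membership is scored — exactly the generalization-across-individuals problem the excerpt raises.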