Catching bias in data solutions
Can artificial intelligence predict who will drop out of high school? Laura Hinton sees both the promise and the peril hidden inside this question and others like it.
Educators today have an unprecedented trove of data about learners, mountains of student information they can analyze for insights, interventions and even predictions. “But what happens when we just bring our algorithms over to education,” says Hinton, “without fully understanding who they are impacting—and how?” She points to parallels in other fields, such as the work of the Algorithmic Justice League, founded by Joy Buolamwini, whose research has revealed racial biases in facial recognition technology. “We’ve seen how a lack of diversity in tech is a big part of that.”
Hinton – who earned a sociology degree at Harvard, taught briefly in Detroit and completed a data-driven fellowship at the Kapor Center in Oakland – is a member of the inaugural cohort of the GSE’s new education data science master’s program, an interdisciplinary degree that she says felt tailor-made for her interests. “It’s exactly what I was looking for: the intersection of education with data science.”
Intersectionality isn’t an abstract concept for Hinton; it’s part of her lived experience as a Black, gay woman. Surveys find that only 15 to 22 percent of data scientists are women, she says, and far fewer are Black. Hinton knows her perspective will be useful for catching bias in data solutions, whether she goes on to pursue a career in ed tech, public policy or nonprofits. “I’m keeping an eye out for equity at the forefront.”