Artificial intelligence (AI) makes modern life much easier—from tailoring the ads we see when we surf the web to allowing Siri or Alexa to turn on the lights before we get home from work.
But this growing automation also poses ethical problems, including discriminatory facial recognition, disinformation fueled by algorithms that seek out attention and controversy, and built-in biases that affect everything from the jobs we get to the medical treatments we receive.
At the Computing Community Consortium (CCC) Council, an arm of the Computing Research Association, Suresh Venkatasubramanian, a professor in the University of Utah’s School of Computing, has been working to figure out how to resolve AI’s drawbacks, together with colleagues Nadya Bliss, director of the Global Security Initiative at Arizona State University; Melanie Moses, a professor in the University of New Mexico’s Department of Computer Science; and Helen Nissenbaum, a professor of information science at Cornell Tech.
“We know why this story repeats itself,” says Venkatasubramanian. “It’s because AI systems fail to recognize that they affect people and society, and that people don’t—and shouldn’t—behave like compliant numbers in a database.”
In a recent paper exploring AI’s impact on society, the team argues that computer science and the social sciences need to work closely together to understand the effect that AI, and automation broadly, is having on society.
“We need better social science education for computer scientists, more financial and institutional support for joint research efforts, and a re-awakening of a ‘critical instinct’ among technologists, so that we don’t just ask what we should build and how, but why we are building it and whether we should,” Venkatasubramanian adds.
The white paper is part of a series of position statements compiled every four years by the CCC Council and members of the computing research community to inform policymakers and the public about research opportunities. Topics chosen represent areas of pressing national need.
At the University of Utah, Venkatasubramanian is working with Matt Haber, a professor in the College of Humanities, to develop a certificate program that combines computing, ethics, and social science. The Tanner Center for Human Rights also works to facilitate such cross-disciplinary discussions.
Suresh Venkatasubramanian, School of Computing, email@example.com
Rebecca Walsh, Communications Manager, firstname.lastname@example.org