Technological systems were developed to make tasks easier for people, but our human biases may limit our trust in those systems. Venkata Sriram Siddhardh (Sid) Nadendla, assistant professor of computer science at Missouri S&T, focuses his research on building human-systems partnerships so that humans and computers work as a team.
“I look at issues such as trust and misalignment in socio-technical systems,” Nadendla says. “Issues of trust can arise because the system isn’t secure or able to handle threats from external sources. The system may be too complex to understand, or it may not offer services fairly across diverse communities.”
Nadendla is looking at system use in a few domains, including transportation – the way humans interact with self-driving cars or use Google Maps to navigate. He says people sometimes distrust even the driver-assist systems in their cars. Nadendla also investigates system use in mining to learn how trapped miners might use technology to help themselves escape.
But first, he says, researchers need to define trust and fairness – concepts that can be hard to nail down.
“The question is, which fairness concept matters most given the context,” he says. “What Missourians consider to be fair might not be thought of as fair in Florida or New York. It’s very contextual and can’t be quantified.”
Nadendla says the problem is so new – emerging as a computer science field in just the last five years – that researchers are just starting to get real-world feedback. Once they establish some parameters of trust and fairness, they can determine how to encourage teamwork between humans and machines.
“If a system has a limitation, the system needs to communicate that in a way people can understand,” he says. “If a person needs something, how does the person communicate that to the system? It’s a fundamental problem, and I don’t know that we’ll solve it in my lifetime.”
Psychology and mechanical engineering play a role in Nadendla’s research. He says mechanical engineers contribute the control aspect of machines, and psychologists bring the human cognition element. Nadendla is working with collaborators at Indiana University, the University of Illinois at Urbana-Champaign, and colleagues in central Florida and within industry.
Another aspect of Nadendla’s research is predicting healthy work-time limits for people such as taxi drivers, manufacturing workers – even social media monitors who filter very violent or potentially harmful images and information. Nadendla says his team has developed algorithms to predict when workers are exhausted and at risk for mental health problems. In tests using data from Chicago taxi drivers, the algorithms predicted the best stopping times within 1% error for about a third of the drivers.
Nadendla says that breaking new ground in research comes with a lot of responsibility. He adds that it’s an exciting journey, especially since his students are at the forefront of discoveries.
“I basically offer them the problem, and they are the ones who solve it,” Nadendla says. “They come up with brilliant ideas.”
Fascinating and important work.
In the future, perhaps medicine can be added to address the often serious mistrust patients have in their test results and in what their doctors say about them. (Please see page 7 of the 1/6/2022 issue of The New England Journal of Medicine for one physician’s viewpoint on trust.)