Can we make our robots less biased than we are?


Adherence to the declaration, Dr. Bethel argues, would prohibit researchers from working on projects like search-and-rescue robots or “social robotics.” One of her research projects uses small, humanoid robots to interview children who have been abused, sexually harassed, trafficked or otherwise traumatized. In her most recent study, 250 children and adolescents who were interviewed were often willing to give information to a robot that they would not disclose to an adult.

Having an investigator “drive” a robot from another room can make interviews with child survivors less painful and more informative, said Dr. Bethel, who is a trained forensic interviewer.

“You have to understand the problem area before you can talk about robotics and police work,” she said. “They’re generalizing a lot without a lot of information.”

Dr. Crawford is a signatory of both “No Justice, No Robots” and the Black in Computing open letter. “And you know, whenever something like this happens, or an awareness comes up, especially in the community where I work, I try to make sure I support it,” he said.

Dr. Jenkins declined to sign the “no justice” statement. “I thought it was worth considering,” he said. “But in the end, I thought the bigger issue was really being in the room: in the research lab, in the classroom, on the development team and on the executive board.” Ethics discussions should be rooted in that fundamental civil-rights question, he said.

Dr. Howard has not signed either statement. She reiterated her point that biased algorithms are, in part, the result of the skewed demographic (white, male, able-bodied) that designs and tests the software.

“If outsiders with ethical values aren’t working with these law enforcement agencies, then who is?” she said. “When you say ‘no,’ others will say ‘yes.’ It’s not good if there’s no one in the room to say, ‘Um, I don’t think the robot should kill.’”