Rice U. expert: ‘Humanism’ is an empty buzzword for AI developers

Kirsten Ostherr available to discuss her Washington Post op-ed

HOUSTON — (June 22, 2018) — As a growing number of tech companies tout their progress in the development of “humanistic” artificial intelligence, Rice University media scholar Kirsten Ostherr warns that such rhetoric is often unclear and frequently meaningless. “Unless these companies reconsider their underlying approach,” Ostherr wrote in an op-ed this week for the Washington Post, “their words will remain empty.”

Ostherr, the Gladys Louise Fox Professor of English at Rice, is available to discuss the issue with media.

“Apple describes Siri as ‘humanistic AI — artificial intelligence designed to meet human needs by collaborating [with] and augmenting people,’” Ostherr wrote. “Microsoft Chief Executive Satya Nadella has said, ‘Human-centered AI can help create a better world.’ Google’s Fei-Fei Li has called human-centered AI ‘AI for Good and AI for All.’ Facebook Chief Executive Mark Zuckerberg believes the company can build ‘long-term social infrastructure to bring humanity together.’”

But what does any of this mean?

Ostherr points to a long list of AI’s current failings, including an “egregious case of Google’s image-labeling algorithm that classified black people as gorillas.” And even as it was promoting the idea of human-centered AI, Google was actively pursuing Project Maven, “a major Department of Defense contract to develop artificial intelligence for use in drones,” Ostherr wrote.

“Photographic images from cameras mounted on drones are widely used to gather visual evidence and provide forensic truth value for military decision-makers,” wrote Ostherr. “But those images are not transparently legible, and it takes a huge amount of human labor to interpret the data, especially in the categories of age, sex and race.
Numerous examples of misinterpreted drone footage identifying the wrong target already exist.”

After strong backlash from the public and within its own ranks, Google recently announced it would end the Project Maven military contract in 2019. “And yet even as technology lags behind human capability when it comes to contextual sensitivity,” wrote Ostherr, “we still hope to entrust it with life or death decisions.”

Achieving true human-centered AI would require programmers to collaborate with scholars and authorities in other fields, including the humanities. “But simply adding humanistic researchers to examine the social impact of AI after it is deployed, without also changing the development process, probably won’t get us very far,” wrote Ostherr. “Calling AI humanistic without truly integrating experts in the humanities who can bring diverse perspectives to the ethical reasoning behind these initiatives will lead only to continued cases of bias and further erosion of public trust.”

To read Ostherr’s full op-ed, go here (subscription is required).

For more information or to schedule an interview with Ostherr, contact David Ruth, director of national media relations at Rice, at [email protected] or 713-348-6327.

-30-

Rice University has a VideoLink ReadyCam TV interview studio. ReadyCam is capable of transmitting broadcast-quality standard-definition and high-definition video directly to all news media organizations around the world 24/7.
