Last week, prominent Google artificial intelligence researcher Timnit Gebru said she was fired by the company after executives asked her to retract or remove her name from a research paper, and she objected. Google maintains that she resigned, and Alphabet CEO Sundar Pichai said in a company memo on Wednesday that he would investigate what happened.
The episode is a pointed reminder of tech companies’ influence and power over their field. Artificial intelligence underlies lucrative products like Google’s search engine and Amazon’s virtual assistant Alexa. Large companies produce influential research papers, fund academic conferences, compete to hire top researchers, and own the data centers required for large-scale AI experiments. A recent study found that the majority of tenure-track professors at four top universities who disclose their funding sources had received Big Tech support.
Ben Recht, an associate professor at the University of California, Berkeley, who has spent time at Google as a visiting professor, says his fellow researchers sometimes forget that companies’ interest doesn’t stem only from a love of science. “Corporate research is amazing, and there have been some amazing things that have come out of Bell Labs, PARC, and Google,” he says. “But it is strange to claim that university research and corporate research are the same.”
Ali Alkhatib, a researcher at the Center for Applied Data Ethics at the University of San Francisco, says the questions raised by Google’s treatment of Gebru risk undermining all of the company’s research. “It feels precarious to cite because there may be things behind the scenes that they weren’t able to talk about, which we will learn about later,” he says.
Alkhatib, who previously worked in Microsoft’s research division, says he understands that corporate research comes with limitations. But he would like to see Google make visible changes to regain the trust of researchers inside and outside the company, perhaps by insulating its research group from other parts of Google.
The paper that led to Gebru’s departure from Google highlighted ethical questions raised by AI technology that works with language. Google’s head of research, Jeff Dean, said in a statement last week that it did not meet “our publication bar.”
Gebru has said managers may have viewed the work as a threat to Google’s business interests, or as an excuse to fire her for criticizing the lack of diversity in the company’s AI group. Other Google researchers have said publicly that the company appears to have used its internal research review process to punish her. More than 2,300 Google employees, including many AI researchers, have signed an open letter asking the company to establish clear guidelines on how research will be handled.
Meredith Whittaker, faculty director at New York University’s AI Now Institute, says what happened to Gebru is a reminder that although companies like Google encourage researchers to think of themselves as independent academics, the companies prioritize their own results above academic standards. “It’s easy to forget, but at any moment a company can spike your work or shape it so it functions more as PR than as knowledge production in the public interest,” she says.
Whittaker worked at Google for 13 years but left in 2019, saying the company had retaliated against her for organizing a walkout over sexual harassment and had undermined her work raising ethical concerns about AI. She helped organize employee protests against an AI contract with the Pentagon that the company eventually abandoned, although it has taken on other defense contracts.
Machine learning was an obscure corner of academia until around 2012, when Google and other technology companies became intensely interested in breakthroughs that made computers much better at recognizing speech and images.
The search-and-advertising company, quickly followed by competitors like Facebook, hired and acquired leading academics and urged them to continue publishing papers in between work on corporate systems. Even traditionally tight-lipped Apple committed to becoming more open with its research, with the aim of attracting AI talent. Papers with corporate authors and attendees with corporate badges have flooded the conferences that are the field’s primary publishing venues.