Each expert shared insights about subfields they expect to make strides in the year ahead, such as multitask learning and semi-supervised learning. Everyone seemed to agree that the Transformer indeed transformed natural language AI in 2019, and all coalesced around a shared hope that the AI field will continue to change for the better in 2020.
One person who spoke at length with VentureBeat about how the AI field can evolve in the year ahead is Celeste Kidd, director of the Kidd Lab at the University of California, Berkeley. She told us she hopes neural networks lose their reputation as black boxes, and that more people in machine learning develop realistic views of what babies can learn compared to neural nets. She also talked about the lack of women in machine learning and about sexual harassment.
She was Time Person of the Year in 2017 along with other women associated with the #MeToo movement, and last month she gave the opening keynote address at NeurIPS, the largest AI research conference in the world.
In her speech, Kidd took a deep dive into what machine learning practitioners should know about the human mind – how people form beliefs and how they can be quickly led to believe falsities when content recommendation AI optimizes for engagement. She also talked about her own experience with sexual harassment, and the need to dispel the myth among men in machine learning that being alone with a female colleague can lead to sexual harassment allegations and the end of their careers. When that fear leads to missed opportunities for women in the field, Kidd said, even well-intentioned people with no desire to inflict harm can end up damaging women's careers.
The speech ended with a standing ovation, a rarity at a machine learning research conference.
Misperceptions held by men in machine learning are something she said nobody wants to address in a NeurIPS keynote, but she felt she had to, given the opportunity to speak with so many people who are directly responsible for decisions at their companies or for training female students at universities.
In 2018, analysis by Element AI found that the number of women authors of papers published at major AI research conferences like NeurIPS remains below 20%, while a 2019 Nesta report on gender diversity in AI found that less than 30% of AI research published on arXiv in the U.S. had a female author. Some countries like the Netherlands surpass 40% female authors but no nation achieves gender parity.
Bringing more women into machine learning research requires taking sexual harassment seriously and exposing predators, Kidd said, because she believes it's a contributor to the leaky tech pipeline. She also stressed that for the average person, it's not a single dramatic event, but more often 1,000 seemingly small events – what she called "death by 1,000 paper cuts" – that push women out of the field.
The day after her speech, Kidd said, she tried without success to reach the conference's poster session, stopped along the way by men and women alike – men offering thanks for calling this fear misguided, and women who said they had been left out of social events with peers.
"You learn just as much from your peers, if not more than you do from your mentors," she said. "So when you have a lab treating a woman as otherly, if you're not treating her the same, she doesn't get the same access to all of the informal training opportunities that exist, all the opportunities that everybody else in the lab has for learning from their peers."
Inviting women to be part of social outings and dispelling the misperception that mentoring women will lead to sexual harassment allegations and the end of men's careers are important, but getting rid of serial predators is critical, Kidd believes, to achieving parity and closing the AI research gender gap. Rejecting the Pence Rule – that men should avoid being alone with a female colleague unless their wives or others are present – could also help.
"If you set up a rule like 'I'm only going to meet with women with the door open' [or] 'I'm only going to meet with women when there's somebody else present,' you're introducing a systemic inequity that means that she doesn't get as much access to your mentorship [as] somebody that doesn't have to have those particular circumstances in place," Kidd said.
One thing that stood out from the interview: It's not just an individual who loses out when a woman is pushed out of the industry or a persistent gender gap emerges in machine learning research. It's a loss for the machine learning industry, as well. And as AI spreads to all corners of business and society, that means everyone loses.
When it was introduced in September 2019, Google's ALBERT language model achieved state-of-the-art (SOTA) results on popular natural language understanding (NLU) benchmarks like GLUE, RACE, and SQuAD 2.0. Google has… (via Synced Review)
A team of researchers from Mila and Google Brain believe simple pencil sketches could help AI models generalize to a better understanding of unseen images. Deep neural networks excel in practical… (via Synced Review)
Theresa Brown writes that the recent Google report that shows an AI system being able to read a mammogram more accurately than doctors is a welcome development to cut down on inaccurate readings… (via CNN)
With black box AI, people are refused or given loans, accepted or denied university admission, offered a lower or higher price on car insurance, and more, all at the hands of AI systems that usually offer no explanations.