Titanium Media note: this article comes from the official WeChat account Silicon Rabbit Race (ID: sv_race). Author: Molly Fosco; compiled by Vivian; edited by Lu. Published by Titanium Media with authorization.
01. Labels and categories do not fully reflect the complexity and potential of individuals
Alice Xiang grew up in Tennessee, at the foot of the Appalachian Mountains, as one of the few Asian students in predominantly white schools.
As she moved into advanced classes, she noticed that more and more of her peers came from upper-middle-class families, while many of her elementary school classmates from less privileged backgrounds ended up on paths with fewer opportunities.
The experience stayed with Xiang even as she went on to elite universities such as Harvard, Oxford, and Yale. In fact, it is one of the main reasons she now specializes in algorithmic fairness as a research scientist at the Partnership on AI (PAI).
Growing up in Tennessee, she says, made her realize that labels and categories "may not fully reflect the complexity and potential of the individual."
Early in her career, when Xiang first trained machine learning algorithms, she found that her own judgment about which data were relevant heavily shaped the process. The same was true of her colleagues.
"It makes me uncomfortable that the people making decisions around these algorithms live in big cities, have gone to graduate school, and have little contact with people whose backgrounds are very different from their own," she said. Notably, few of her colleagues were women or members of ethnic minorities.
The scarcity of women in technical jobs is a well-documented problem, and the situation has hardly improved. Compared with other technology roles across the industry, such as web development, user experience design, or data science, the proportion of women in artificial intelligence is particularly worrying.
Experts from technology giants such as Google, Apple and Facebook, as well as many researchers in this field, say that artificial intelligence will revolutionize every aspect of our lives.
Artificial intelligence describes the ability of computers to bring human-like intelligence to their decision-making. Modern artificial intelligence algorithms are trained on large data sets, learn by recognizing patterns, and then predict what to do next. Machine learning, a subset of artificial intelligence, is increasingly used to solve problems across industries.
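To make that train-on-patterns-then-predict loop concrete, here is a minimal sketch using scikit-learn and synthetic data (my own toy example, not from the article):

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# A small synthetic dataset stands in for the "large data sets" above.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

model = RandomForestClassifier(random_state=0)
model.fit(X[:800], y[:800])        # learn patterns from training examples
print(model.predict(X[800:805]))   # predict labels for unseen examples
```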
Artificial intelligence is already ubiquitous in interactive devices and services: Face ID on the iPhone, recommended products on Amazon, suggested songs on Spotify, automatic credit card fraud detection, controlling the heating and cooling of buildings, scheduling flight departures and arrivals, and so on.
Some experts predict that the technological singularity, the point at which artificial intelligence matches or surpasses human intelligence, may occur in our lifetime. Some think it could come as early as 2030, while others say it could take centuries.
If this "general artificial intelligence" becomes a reality, then jobs such as legal assistants, radiologists and hiring managers are expected to be more automated. The World Economic Forum predicts that automation will replace 75 million jobs and create 133 million new jobs by 2022.
Despite the economic promise of artificial intelligence, women now make up only 22 per cent of the world's artificial intelligence professionals.
According to analysis by LinkedIn and the World Economic Forum, only 12% of machine learning researchers are women. The technology that may one day hire us, manage our health care, and weigh in on courtroom decisions is being designed almost entirely from the perspective of white, well-educated men.
This homogeneity in the field of artificial intelligence subtly shapes how society views women and minorities, how these groups are treated, and how they are able to participate in the industry. If nothing changes, we are likely to keep reinforcing society's ingrained prejudices against women and minorities.
02. Most people who criticize artificial intelligence are women and minority groups
There is already evidence that marginalized groups are at a serious disadvantage in employment, access to credit and loans, and access to appropriate health services; left unchecked, this will only worsen as the industry grows.
Most people who criticize artificial intelligence are women and minorities because they are more likely to encounter algorithmic injustices.
But a growing number of people working in artificial intelligence and machine learning are committed to ensuring that the future described above never becomes reality. Researchers have begun calling attention to AI ethics, an area concerned with the design of "responsible" artificial intelligence. Much of this research has been done by women, including women from ethnic minorities.
Growing attention to AI's impact on society is also pushing the field to include researchers from disciplines beyond computer science, especially the social sciences and humanities.
AI ethics requires considering and prioritizing the sociological and psychological effects of technology. As ethics and fairness become more central to the future of artificial intelligence, that future may draw more people from different backgrounds into the field and make it more inclusive of the consumers it serves.
03. AI is magnifying existing prejudices
"what's the weather like today, Siri?"the highest temperature in San Francisco will reach 55 degrees today, and it will be sunny most of the time."
When you read Siri's answer in your head, it's probably a woman's voice.
"people can choose to change Siri's voice to male, but almost no one will change it." Rachel Thomas, director of the Center for Applied data Ethics at the University of San Francisco. Rachel Thomas is the founder of fast.ai, fast.ai is a free online program for programmers to use artificial intelligence tools.
Amazon's Alexa and Microsoft's Cortana also default to female voices and female names. "both men and women show a preference for female assistants, and we are comfortable with women as assistants." Thomas said.
Before the artificial intelligence voice assistant, a 2008 Indiana University study found that both men and women showed a preference for female voices compared to male or computer voice. Amazon and Microsoft have publicly said that women's voice tests are better in the research and testing of their voice assistant products.
The impact is already clear. In May 2019, a study released by the United Nations Educational, Scientific and Cultural Organization (UNESCO) found that making voice assistants female by default reinforces the stereotype that women exist to serve others.
"this is an example of the risk that artificial intelligence brings," Thomas said. "We are looking at the current society, locking it, and strengthening it."
CEO Tess Posner of AI4ALL, an educational nonprofit that aims to improve the diversity of artificial intelligence, agrees. "AI shows the bias we are born with," she said. "Assistant work is considered to be a female role, so AI is amplifying existing biases by making voice assistants sound female."
The creators of these products are not entirely male; Toni Reid is one of the two people who created Amazon's Alexa. But among the designers of AI voice assistants, the vast majority of decision-makers are white men.
As of 2018, only 26.8% of Amazon's global managers were women, while 73.2% identified as men. That year, Bloomberg also reported that at a weekly Amazon AWS meeting where 200 employees presented their latest results, barely more than five of the people in the room were women.
Siri was originally created by three men. Globally, 77% of Apple's technical staff are men; 49% are white, 35% Asian, 8% Hispanic, and 6% Black.
04. Search "CEO" on Google Images: almost all men
When Apple first launched Siri, if you told her you were having a heart attack, she would call 911, but she had no response for rape or domestic violence. If you told Siri you had been raped, she would reply, "I don't know what you mean by 'I was raped'."
And in early 2019, if you said, "Hey Siri, you're a bitch," she would reply, "I'd blush if I could." These responses were later changed in updates, but they showed how limited the worldview behind her original design was.
"We still have a lot of work to do to identify biases and make sure we're dealing with these things," Posner said. "that's good, but at the end of the day, it's about power and who's building these systems."
Not just Siri and Alexa, artificial intelligence can amplify and reinforce our existing biases in countless ways.
In 2015, a University of Washington study found that searching for "CEO" on Google Images returned results that were almost entirely male. Only 11% of the pictures featured women, even though women made up 27% of CEOs in the United States at the time.
According to the Pew Research Center, the share of female CEOs in the United States had risen to 28% by 2019, while the proportion of women in Google's "CEO" image results had dropped to 10%.
When you run an image search on Google, the search algorithm reads metadata from billions of images online, finds the most common matches, and surfaces them. The image results for a given occupation therefore reflect how companies, organizations, and the media choose to represent that profession.
"some people think that this is because many CEO are now male," Thomas said. "but it also reinforces and magnifies our stereotype that men are CEO."
Research has proved this point. The same study by the University of Washington in 2015 found that gender stereotypes in search image results influenced people's perception of the proportion of male to female workers in a particular field.
"it actually changes the way people think about what they think." Dr. Vivienne Ming, founder of Socos Lab, said the lab is a think-tank focused on artificial intelligence, neuroscience and educational reform. "as these systems continue to cycle, they become this closed loop, reinforcing our own prejudices."
However, there is no clear consensus on how to solve this problem. "if all the voice assistants are women, then there is a problem, but what is the fair zone?" Xiang said. "similarly, for CEO image search results, 50% is the most meaningful thing to do? should we show what we actually see? or should it be somewhere in between?"
If an algorithm is trained on biased data, how can we make it fair? One option, Xiang said, is to use additional data sets to estimate the degree of bias in a model's training data and then rebalance that data accordingly.
For example, data published in the American Journal of Drug and Alcohol Abuse show that Black and white Americans use and sell drugs at similar rates, yet Black Americans are roughly 2.6 times more likely to be arrested for drug-related offenses. The former data can be used to adjust a dataset built from the latter.
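As an illustration of the kind of rebalancing Xiang describes, here is a minimal sketch (my own construction; the function name and usage are hypothetical, not her method) that down-weights records from a group known to be over-represented in arrest data:

```python
import numpy as np

def disparity_weights(groups, over_arrest_factor=2.6, inflated_group="black"):
    """Per-record training weights that offset a known arrest-rate disparity.

    over_arrest_factor: an external estimate of how over-represented the
    group is among arrests (e.g. the ~2.6x figure cited above).
    """
    groups = np.asarray(groups)
    weights = np.ones(len(groups))
    # Records from the over-policed group are over-represented among
    # arrest records, so each one is down-weighted accordingly.
    weights[groups == inflated_group] = 1.0 / over_arrest_factor
    return weights

# Hypothetical usage with any scikit-learn-style classifier:
#   model.fit(X, y, sample_weight=disparity_weights(race_column))
```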
In 2017, London-based AI researcher Josie Young developed a feminist chatbot design process to help organizations build ethically and socially aware chatbots and AI interfaces. Her guidelines became the foundation of a feminist chatbot called F'xa, created by a collective called Feminist Internet, which educates users about the risks of embedding prejudice in artificial intelligence systems.
The problem is that building "fairness" into artificial intelligence systems requires thinking about the concept quantitatively, and researchers define fairness in different ways.
Dr. Vivienne Ming agrees. "When people talk about fairness in artificial intelligence, they mean different things," she said. "Sometimes they mean transparency, that is, how the algorithm works. Sometimes they mean the algorithm's outputs, or how it was trained. Fairness is hard to define."
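To show concretely why the definitions diverge, here is a short sketch of two standard quantitative fairness criteria from the research literature, demographic parity and equal opportunity (the metrics are standard; the code is my own illustration, not something the article specifies):

```python
import numpy as np

def demographic_parity_gap(y_pred, group):
    """Largest difference in positive-prediction rates across groups."""
    y_pred, group = np.asarray(y_pred), np.asarray(group)
    rates = [y_pred[group == g].mean() for g in np.unique(group)]
    return max(rates) - min(rates)

def equal_opportunity_gap(y_true, y_pred, group):
    """Largest difference in true-positive rates across groups."""
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    tprs = [y_pred[(group == g) & (y_true == 1)].mean()
            for g in np.unique(group)]
    return max(tprs) - min(tprs)

# A single model can score well on one gap and badly on the other,
# which is one concrete way definitions of "fairness" conflict.
```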
05. The unfairness of algorithms
Beyond shaping social perceptions of women and ethnic minorities, artificial intelligence can also exert a subtle influence on how women and other marginalized groups are treated.
Companies such as AT&T, Hilton and Humana have used artificial intelligence in the recruitment process to ensure that applicants meet the basic criteria for the position.
In 2018, Amazon's machine learning experts found that their recruiting algorithm downgraded resumes for technical positions that included the word "female" and penalized graduates of two all-women's colleges. The algorithm had been trained on ten years of Amazon hiring data, in which most technical hires were men.
"In recruiting, we often hope that if gender is completely erased from resumes, the artificial intelligence won't learn these biases," Xiang said. But if the candidate pool in the training data skews male, "the AI faces an obvious challenge: it will favor men over women."
The consequences are not limited to hiring.
In criminal risk assessment, artificial intelligence is used to estimate the likelihood that a person will reoffend, which judges then take into account at sentencing. Like recruiting algorithms, criminal risk assessment tools are usually trained on historical data, and according to a Bureau of Justice Statistics report, Black Americans are more likely to be stopped by police than white or Hispanic Americans.
Timnit Gebru, a research scientist on Google's ethical artificial intelligence team, points out that most critics of artificial intelligence are women and minorities because they are more likely to experience algorithmic injustice. "People from marginalized groups have been doing the real work of pushing this issue into public view."
In 2018, MIT researcher Joy Buolamwini found that the world's most widely used facial recognition tools, from IBM, Microsoft, and Face++, misidentified female faces more often than male faces, and in many cases could not detect dark-skinned faces at all.
This means that when facial recognition is used for security surveillance, women and minorities are more likely than white men to be flagged as threats.
06. People from different fields are entering the AI industry
"these systems are being invisibly embedded in our society," Posner said. "this not only magnifies some of the prejudices in our hearts, they may change our lives."
It is difficult for artificial intelligence to deal with situations that are not uniformly classified.
Dr Vivienne Ming, a transgender woman, has experienced first-hand that it is difficult for AI to read her gender. "when I pass a full-body scanner at an American airport, I am always marked because the ratio of my hips to shoulders is abnormal for a woman." When I am marked, the security staff will put their hands between my legs, which is so unfair. "
Will a more diverse workforce improve these problems? "of course, to some extent," Dr. Ming said, "AI is just a tool that can only do what its commanders know."
The solution is not simply to hire more women in AI. "it may be controversial," she says, "but if you think that hiring more women (in artificial intelligence) will magically solve the problem, you are wrong."
What we need, says Dr. Ming, is more people who understand how algorithms affect humans. Other experts agree, and are working to make that happen.
Abeba Birhane, an AI researcher, believes that artificial intelligence should prioritize understanding over prediction.
We should not simply rely on the patterns algorithms predict; we should routinely ask why those patterns exist. For example, why do criminal risk assessment tools show that Black and brown people are more likely to be arrested? Could it be the result of over-policing in their communities?
Been Kim, a research scientist at Google Brain, is developing self-explaining artificial intelligence software to help humans understand how the technology works.
She recently built a system that acts as a "human translator," helping people understand when artificial intelligence is not working the way it should. For example, if an AI system is trained to recognize zebras in images, the tool can reveal how much weight the AI gives to "stripes" when making its decisions.
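Kim's published work in this vein is TCAV (Testing with Concept Activation Vectors); the sketch below is my simplified reconstruction of its core idea, not Google's code: learn a "stripes" direction in a model's activation space, then measure how much the zebra score responds to it.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def concept_direction(concept_acts, random_acts):
    """Fit a linear probe separating concept examples (e.g. striped
    textures) from random ones; its weight vector points along the
    concept in activation space."""
    X = np.vstack([concept_acts, random_acts])
    y = np.r_[np.ones(len(concept_acts)), np.zeros(len(random_acts))]
    v = LogisticRegression().fit(X, y).coef_[0]
    return v / np.linalg.norm(v)

def concept_sensitivity(score_fn, acts, v, eps=1e-2):
    """Finite-difference directional derivative: a positive value means
    nudging the activations toward 'stripes' raises the zebra score."""
    return (score_fn(acts + eps * v) - score_fn(acts)) / eps

# Hypothetical usage: `acts` is one image's hidden-layer activations and
# `zebra_score` maps activations to the model's zebra logit:
#   v = concept_direction(striped_acts, random_acts)
#   print(concept_sensitivity(zebra_score, acts, v))
```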
"you don't have to know everything about artificial intelligence models," Kim said, "but as long as you have information about safe use of this tool, that's our goal."
07. Hey, Siri, define feminism.
As automation becomes more widespread, jobs centered on working with machines, such as construction and factory work, are shrinking rapidly, while jobs that rely heavily on interpersonal skills, such as health care and social work, are growing quickly.
A 2018 University of York study found that over the past 25 years, college-educated men have become less likely, and college-educated women more likely, to work in white-collar jobs. The biggest reason for the shift? Rising demand for social skills in professions such as medicine, software engineering, and economics.
These jobs require relatively high emotional intelligence (EI), which is quite difficult to automate. Several studies have found that women score higher than men on EI tests, including on every subscale, such as understanding, expressing, and perceiving emotions.
This is not to say that every woman has higher EI than every man, nor that these traits are biological. Some research suggests that women are more likely to develop them because of social conditioning.
If ethics, which also demands high EI, continues to grow in importance in artificial intelligence, that demand may draw more women into the industry. According to the National Science Foundation, women have made up at least half of all social scientists in the United States since the early 1990s.
"at PAI, we work a lot with female researchers, focusing on the ethics and transparency of artificial intelligence." "in these areas of research, the proportion of women is quite high," Xiang said.
Xiang's own background honed her EI skills. She grew up in remote areas before entering a world-class elite university. These growing experiences influenced her later research on the fairness of the algorithm. What are the chances of a person doing a good job or defaulting on a loan? It cannot be determined by historical data alone.
Xiang believes that domain expertise is also becoming increasingly important to the artificial intelligence industry. Many of her colleagues were STEM majors who worked in jobs not directly related to STEM after college and then moved into artificial intelligence.
Xiang herself worked in statistics, economics, and law before entering artificial intelligence, and she now applies that expertise to her research.
Thomas, who runs fast.ai with her husband, wants to put artificial intelligence into the hands of a wide and diverse group of people across fields. "We believe domain experts are the people most familiar with their own problems," Thomas said. "We teach experts in these different fields to use deep learning, rather than looking for deep learning PhDs who happen to be interested in those fields."
Several of Thomas's fast.ai students are using artificial intelligence to advance research in their own disciplines. Alena Harley, director of machine learning at the Human Longevity Institute and a fast.ai alumna, is using AI algorithms to identify the sources of metastatic cancers. In recent trials, Harley reduced the error rate by more than 30 percent.
Ask Siri what feminist values are and she replies, "I found this online," then pulls up search results ranked by popularity. In many cases, the top article is one written by a career coach titled "What is feminism and why do so many women and men hate it?"
F'xa, the feminist chatbot, answers a little differently. She says that feminist values can mean different things to different people, depending on their backgrounds and the challenges they face.
Original title: Code Switch
Author: Molly Fosco
Original text links:
https://trix-magazine.com/global-affairs/code-switch/
https://www.toutiao.com/i6840586371678077447