2025-01-21 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)11/24 Report--
The number of AI papers has surged, but only 4% of researchers consider AI tools a true "necessity" today.
This conclusion comes from the latest survey by Nature.
More precisely, it reports the results of a survey of researchers who use AI tools in their scientific work.
The respondents were drawn from more than 40,000 researchers worldwide, spanning many disciplines, who published papers in the last four months of 2022.
The survey also included "insiders" who develop AI tools and "outsiders" who do not use AI in their research, for a total of more than 1,600 participants.
At present, the relevant results have been published under the title "AI and science: what 1600 researchers think".
What exactly do scientific researchers think of AI tools? Let's move on.
What 1,600 researchers think of AI
The survey focused on researchers' views on machine learning and generative AI.
In order to ensure the objectivity and validity of the survey results, as mentioned above, Nature contacted more than 40,000 scientists from around the world who published papers in the last four months of 2022 and invited readers of the Nature newsletter to participate in the survey.
In the end, 1,659 respondents were selected. The sample breaks down as follows:
Most respondents were based in Asia (28%), Europe (nearly one-third) and North America (20%).
Of these, 48% directly developed or studied AI, 30% used AI in their research, and 22% neither developed nor used AI in their research.
Let's take a look at the detailed results.
According to the survey, more than a quarter of those who use AI in their research believe AI tools will become a "necessity" in their field within the next decade.
But only 4% think AI tools are a "necessity" now, while another 47% think AI will be "very useful" in the future.
By contrast, researchers who do not use AI are less enthusiastic. Even so, 9% of them expect these technologies to become "essential" within the next decade, and another 34% say they will be "very useful".
In the machine-learning portion of the survey, respondents were asked to select the positive effects of AI tools: two-thirds said AI enables faster data processing, 58% said it accelerates computations that were previously infeasible, and 55% said it saves time and money.
The negative effects respondents cited were: encouraging reliance on pattern recognition rather than deep understanding (69%), entrenching bias or discrimination in data (58%), increasing the likelihood of fraud (55%), and producing irreproducible research when used blindly (53%).
Let's take a look at the researchers' views on generative AI tools.
Most respondents see summarization and translation as a key advantage of generative AI tools, which can help researchers whose first language is not English improve the grammar and style of their papers. Its ability to write code was also widely recognized.
But generative AI has its problems. Researchers are most concerned about the spread of inaccurate information (68%), plagiarism becoming easier and harder to detect (68%), and errors or inaccuracies being introduced into papers and code (66%).
Respondents added concerns about faked research, false information and the entrenchment of long-standing biases, for example if AI tools used for medical diagnosis were trained on biased data.
Moreover, usage statistics show that even among researchers interested in AI, only a small number often use large language models at work.
Across all surveyed groups, the most common use of AI was creative entertainment unrelated to research, followed by using AI tools to write code, generate research ideas and help write papers.
Some scientists are unimpressed by the output of large models. One researcher who used a large model to help edit a paper wrote:
It feels like ChatGPT has copied all the bad writing habits of human beings.
Johannes Niskanen, a physicist at the University of Turku in Finland, said:
If we use AI to read and write articles, science will soon change from "for humans by humans" to "for machines by machines".
In the survey, Nature also delved into the researchers' views on the difficulties facing the development of AI.
The dilemmas facing AI development
About half of the researchers said they had encountered obstacles in developing or using AI.
The most worrying problems for researchers who develop AI are the lack of computing resources, the lack of research funding, and the lack of high-quality data for AI training.
People who work in other fields but use AI in research are more worried about the lack of sufficiently skilled scientists and training resources, as well as security and privacy.
Researchers who do not use AI said they do not need it, find it impractical, or lack the time and experience to learn these tools.
Notably, respondents were also concerned that commercial giants dominate AI's computing resources and the ownership of AI tools.
23% of AI-tool developers said they collaborate with or work for companies developing AI tools (Google and Microsoft were mentioned most often), compared with only 7% of those who merely use AI.
Overall, more than half of the respondents thought it was "very" or "somewhat" important for researchers using AI to work with scientists at these companies.
In addition to development, there are also some problems in usage.
Researchers have previously said that the blind use of AI tools in scientific research can lead to erroneous, false and unreproducible results.
Lior Shamir, a computer scientist at Kansas State University in Manhattan, thinks:
Machine learning can sometimes be useful, but AI is causing more problems than it solves. Scientists who use AI without knowing what they are doing can end up with false discoveries.
When asked whether journal editors and peer reviewers could adequately review papers using artificial intelligence, respondents were divided.
About half of the researchers who used AI in their studies but did not develop it directly said they were unsure, about a quarter thought the review was adequate, and some said it was inadequate. Researchers who develop AI directly tended to take a more positive view of the editorial and review process.
In addition, Nature asked respondents how worried they were about the seven potential effects of AI in society.
The spread of misinformation was the biggest concern, with two-thirds of respondents saying they are "very" or "somewhat" worried about it.
The least-cited concern was that AI may pose an existential threat to humanity.
Reference link: https://www.nature.com/articles/d41586-023-02980-0