2025-02-21 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/02 Report--
http://blog.itpub.net/31545819/viewspace-2285058/
As everyone knows, although artificial intelligence seems like a new technology, it has a long history. The AI boom we are living through is just one more golden period in a field that has already weathered several cold winters.
But a recent wave of "AI winter" talk around deep learning has once again pushed artificial intelligence into the court of public opinion, and a new study aimed at the "elephant in the room" has fueled the negative hype about AI. Are deep learning systems seriously flawed? Are we really headed down the wrong path?
Recent research on convolutional neural networks has focused on object detectors, which are specialized versions of more general convolutional networks. Instead of classifying a whole photo as a cat or a dog, an object detector must process the entire image, discard insignificant sub-regions, and frame each detected object in a rectangle (a bounding box).
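As background (not from the article itself): a detector's rectangular outputs are conventionally compared to ground-truth boxes by intersection-over-union (IoU). A minimal sketch, with made-up box coordinates in `(x1, y1, x2, y2)` format:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the overlapping rectangle (may be empty).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Two partially overlapping 10x10 boxes: intersection 25, union 175.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25/175 ≈ 0.143
```

A detection is typically counted as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5, which is why a detector that "misses" an elephant or shifts a box under interference fails the benchmark outright.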
One big problem that no neural network has yet solved is the existence of adversarial images. If a network correctly classifies a photo as a cat, the pixels of that image can be changed by such a tiny amount that the change is almost imperceptible, yet the image is now classified as something else, such as a dog. That is, the adversarial image still looks like a cat, but it is classified as a dog or something else. Cleverer still, one can go a step further and create so-called universal adversarial images, which fool a whole family of neural networks regardless of their exact architecture or training.
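The article doesn't name a method, but the standard way such perturbations are built is the fast gradient sign method (FGSM): nudge every pixel a fixed small amount against the gradient of the model's score. It can be sketched on a toy linear "classifier", where the gradient with respect to the input is simply the weight vector; all numbers below are made up, and epsilon is exaggerated so the flip is visible in a 4-pixel toy:

```python
import numpy as np

# Toy linear "classifier": score = w . x; positive -> "cat", negative -> "dog".
# A hypothetical stand-in for a deep net; FGSM works the same way via gradients.
w = np.array([1.0, -2.0, 3.0, -1.0])   # model weights
x = np.array([0.6, 0.4, 0.7, 0.2])    # "image" pixels, firmly scored as "cat"

def classify(v):
    return "cat" if w @ v > 0 else "dog"

# FGSM: step each pixel by -eps in the direction of sign(gradient of score).
# For a linear model the gradient w.r.t. the input is exactly w.
eps = 0.3
x_adv = x - eps * np.sign(w)          # per-pixel change bounded by eps

print(classify(x), "->", classify(x_adv))  # prints: cat -> dog
```

In real attacks on image classifiers, epsilon is chosen far smaller relative to the pixel range, which is what makes the perturbation imperceptible while still flipping the label.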
Last month, two researchers from York University and the University of Toronto published a paper, "The Elephant in the Room," which caused a huge stir in the academic community by exposing shortcomings of neural networks used for object detection. They put an elephant in the room to create another kind of adversarial image: starting from a scene that a trained object detector labels correctly, they pasted an unexpected object (an elephant) into the scene. It turned out that the detector not only failed to detect the elephant reliably, but also began to misclassify other objects in the scene.
Personally, I can't get over the fact that the object detector didn't detect the elephant.
Yes, there was an elephant in the room, and the object detector simply behaved as if it weren't there rather than flagging it, and the cup went undetected as well.
But is that really enough to argue that current AI vision systems have huge flaws?
To quote a recent Quanta article:
"It's a clever and important study that reminds us that 'deep learning' isn't that deep," said New York University neuroscientist Gary Marcus.
True, deep learning isn't that deep, but that by itself shouldn't sway most readers. Because AI, and deep learning in particular, has been overhyped, people react to this research in the wrong way. We don't have artificial brains yet, and our deep neural networks are tiny compared with real brains. Still, we have to acknowledge their ability to analyze data, and even if the field needs fundamental change, it still deserves study.
Compared to the brain, deep learning is not deep at all. This study is the equivalent of showing pictures to an eye and optic nerve alone and being surprised that they can't see.
Our neural networks are the building blocks of a future artificial intelligence. A deep neural network has no model of the world: it has no internal language with which to say that the elephant shouldn't be there, so it never takes a second look at the elephant.
At the moment our deep neural networks are just statistical analyzers, and if you feed them a statistical anomaly, they get it wrong. That's no reason for harsh criticism. Think of a three-year-old child: by that standard the network is already doing well. Except this is not even an artificial child, just a small, isolated "artificial brain" with a modest number of simulated neurons inside.
Still, the study raises some very real questions. After all, statistically speaking, it isn't surprising that the elephant is misclassified; it's just a statistical anomaly. What is surprising is that the presence of the elephant causes errors elsewhere in the image.
Why is there interference?
Which brings us back to adversarial images. Adversarial images do not belong to the statistical distribution of the network's training set, because they differ from natural images. If you examine the perturbation added to an image that becomes misclassified, it usually looks like regular bands or regular noise, completely unlike anything that occurs in natural images. This noise moves the image outside the set of images the network was trained on, so misclassification is understandable.
How does this apply to the elephant in the room?
The same argument applies. The elephant was cut and pasted into the image. The resulting editing edges and background inconsistencies may move the image from the set of natural images into the set of manipulated images, so the problems encountered are of the same kind.
The authors speculate that this is indeed part of the problem, but they also suggest other possibilities. In particular, you might expect that placing the elephant far from the misclassified objects would rule out non-local effects, but this expectation ignores the fact that current neural networks are not scale-adaptive: they must be trained to detect objects of different sizes anywhere in the image, and this introduces non-local effects.
What the study highlights is not that deep learning networks' abilities have been exaggerated, but that without understanding what is actually happening, people will misjudge the problem as a mere anomaly of the object detector. We need to keep investigating how adversarial inputs reveal how neural networks really work, and true AI will not consist of just one neural network, no matter how deep it is.