Apache Spark is a fast, general-purpose computing engine designed for large-scale data processing. Like Hadoop MapReduce, Spark is a general parallel framework; it was open-sourced by UC Berkeley's AMP Lab (the AMP Lab at the University of California, Berkeley). Spark retains the advantages of Hadoop MapReduce, but unlike MapReduce, the intermediate output of a job can be kept in memory, so it no longer needs to be written to and re-read from HDFS. This makes Spark much better suited to iterative workloads such as data mining and machine learning algorithms.
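To make the in-memory point concrete, here is a minimal sketch of an iterative Spark job in Scala. The input path, the "x,y" data layout, and the toy gradient-descent loop are all illustrative assumptions, not something from the original article; the point is that cache() keeps the parsed dataset in memory, so every iteration scans it from RAM instead of re-reading it from HDFS or disk the way a chain of MapReduce jobs would.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch of an iterative job that benefits from Spark's
// in-memory caching. Path and data layout below are hypothetical.
object IterativeDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("IterativeDemo")
      .master("local[*]")            // local mode, for experimentation
      .getOrCreate()

    // Each line is "x,y"; parse once and cache, so later iterations
    // read from memory instead of re-reading the file.
    val data = spark.sparkContext
      .textFile("data/points.csv")   // hypothetical input path
      .map { line =>
        val Array(x, y) = line.split(",")
        (x.toDouble, y.toDouble)
      }
      .cache()

    // Simple gradient descent for y ~ w * x: every iteration scans the
    // same cached dataset -- exactly the access pattern that forces
    // repeated HDFS reads under plain MapReduce.
    var w = 0.0
    val lr = 0.01
    val n = data.count()
    for (i <- 1 to 20) {
      val gradient = data.map { case (x, y) => (w * x - y) * x }.sum() / n
      w -= lr * gradient
      println(s"iteration $i: w = $w")
    }

    spark.stop()
  }
}
```

Without the cache() call the program would still run, but each of the 20 iterations would re-read and re-parse the input file; with it, only the first pass touches storage.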
As for how to learn: reading is the natural starting point, whether paper books or e-books. If you have the budget, you can also study at a training institution.
Next, don't work behind closed doors. Communicate often, visit the relevant community forums, and ask experienced practitioners for advice.
Finally, practice. Building one project yourself teaches you more than reading ten books.
I will continue to share this knowledge in future posts, so stay tuned!