2025-01-16 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/03 report:
To learn big data, you first need to understand the learning route: figure out what to learn first and what comes later. Once the overall framework is clear, the rest is to work up from the basics step by step.
Here is a popular learning route: the Hadoop ecosystem → Storm → Spark → algorithms.
Learning Hadoop is therefore the first step. To be clear up front: before learning Hadoop you need a Java foundation, because Hadoop's internals are written in Java, and you need basic Linux shell commands, because the first thing you will do when learning Hadoop is install it on Linux. Hadoop plays a central role in the big data technology stack; it is the foundation of big data technology, and how solidly you master the Hadoop basics largely determines how far you can go on the big data road.
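Since basic shell fluency is a prerequisite, here is a quick sketch of the kind of commands you will use constantly during installation; the paths and file names are throwaway examples:

```shell
# Basic shell operations you will rely on while installing Hadoop:
# making directories, creating files, and packing/unpacking tarballs.
mkdir -p /tmp/shell-practice           # create a working directory
cd /tmp/shell-practice
echo "hello hadoop" > notes.txt        # create a small file
tar -czf notes.tar.gz notes.txt        # Hadoop and the JDK ship as tarballs like this
tar -tzf notes.tar.gz                  # list the archive contents
```

Unpacking a real Hadoop or JDK release tarball works the same way, with `tar -xzf` instead of `-tzf`.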
Here's how to get started learning Hadoop.
This article takes installing and deploying Apache Hadoop 2.x as its main thread, and along the way introduces the Hadoop 2.x architecture, how each module works, and the relevant technical details.
Installation is not the goal, understanding Hadoop through installation is the goal.
The Hadoop environment setup is covered in five parts.
Part 1: Linux Environment Installation
Hadoop runs on Linux. It can also run on Windows with the help of extra tools, but running it on a Linux system is recommended. The first part covers installing and configuring the Linux environment, installing the Java JDK, and so on.
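As a sketch of the JDK setup step, assuming you have extracted a JDK tarball to `/opt/jdk1.8.0` (an example path; substitute your own):

```shell
# Point the environment at the JDK after extracting it.
# /opt/jdk1.8.0 is an assumed example path.
export JAVA_HOME=/opt/jdk1.8.0
export PATH=$JAVA_HOME/bin:$PATH
echo "JAVA_HOME=$JAVA_HOME"
# With a real JDK in place, `java -version` should now print the version.
```

To make the setting permanent, the same two `export` lines usually go into `~/.bashrc` or `/etc/profile`.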
Part 2: Hadoop Local (Standalone) Mode Installation
Hadoop's local (standalone) mode is only used for local development and debugging, or for quickly installing and trying out Hadoop, so this section is brief.
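In this mode you can run the bundled WordCount example directly against local files. The sketch below prepares a tiny input; the `hadoop jar` lines are commented out because they require an actual Hadoop install, and the example jar's version number will differ with your release:

```shell
# Prepare a tiny local input file for WordCount.
mkdir -p /tmp/wc-demo/input
echo "hello hadoop hello world" > /tmp/wc-demo/input/words.txt

# With Hadoop unpacked at $HADOOP_HOME, the bundled example runs like this
# (jar version is release-specific; adjust the file name):
# cd /tmp/wc-demo
# $HADOOP_HOME/bin/hadoop jar \
#     $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar \
#     wordcount input output
# cat output/part-r-00000
```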
Part 3: Hadoop Pseudo-Distributed Mode Installation
Hadoop is generally learned in pseudo-distributed mode. In this mode all Hadoop modules run on a single machine: each daemon runs in its own process, but everything runs on one operating system, so it is not truly distributed.
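As a minimal sketch of the pseudo-distributed configuration, these are the two standard Hadoop 2.x settings; the conf directory here is a stand-in for the real `$HADOOP_HOME/etc/hadoop`:

```shell
# Minimal pseudo-distributed config sketch.
# /tmp/hadoop-conf stands in for $HADOOP_HOME/etc/hadoop.
CONF=/tmp/hadoop-conf
mkdir -p "$CONF"

# core-site.xml: all clients talk to a single NameNode on localhost.
cat > "$CONF/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# hdfs-site.xml: one machine cannot hold the default 3 replicas, so use 1.
cat > "$CONF/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF
```

After this, a real install would format the NameNode (`hdfs namenode -format`) and start the daemons.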
Part 4: Fully Distributed Installation
Fully distributed mode is the mode used in production: Hadoop runs on a server cluster, and production environments generally deploy HA to achieve high availability.
Part 5: Hadoop HA Installation
HA stands for high availability. To address Hadoop's single point of failure (the NameNode), HA is generally deployed in production environments. This section describes how to configure Hadoop 2.x for high availability and briefly explains how HA works.
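As a minimal sketch of what HDFS NameNode HA configuration looks like, assuming a nameservice called `mycluster` with two NameNodes `nn1`/`nn2` on hosts `master1`/`master2` (all names are examples; a real deployment also needs JournalNode and fencing settings):

```shell
# Sketch: core hdfs-site.xml properties for NameNode HA.
# mycluster, nn1/nn2, master1/master2 are example names.
CONF=/tmp/ha-conf
mkdir -p "$CONF"
cat > "$CONF/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.nameservices</name>
    <value>mycluster</value>
  </property>
  <property>
    <name>dfs.ha.namenodes.mycluster</name>
    <value>nn1,nn2</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn1</name>
    <value>master1:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn2</name>
    <value>master2:8020</value>
  </property>
  <property>
    <name>dfs.ha.automatic-failover.enabled</name>
    <value>true</value>
  </property>
</configuration>
EOF
```

One NameNode is active and the other standby; with automatic failover enabled, ZooKeeper-based failover controllers promote the standby when the active one dies.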
Brief introductions to the concepts involved are interspersed throughout the installation steps. I hope it helps.
The above only outlines the framework; owing to limited time, if you want details on specific steps, feel free to leave a message and discuss them with me.
Once the environment is set up, try writing a MapReduce job, packaging it, and running it. When you are comfortable with the application-programming side of Hadoop, move on to understanding the core ideas of MapReduce, especially map, shuffle, join, and reduce.
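The map/shuffle/reduce flow can be mimicked with plain shell tools, which is a useful mental model before digging into Hadoop's implementation (this is an analogy, not Hadoop itself):

```shell
# WordCount with plain shell tools, mirroring the MapReduce phases:
#   map:     tr      -- emit one word ("key") per line
#   shuffle: sort    -- bring identical keys together
#   reduce:  uniq -c -- aggregate a count per key
printf 'hello world\nhello hadoop\n' | tr -s ' ' '\n' | sort | uniq -c | awk '{print $2, $1}'
# Output:
# hadoop 1
# hello 2
# world 1
```

Hadoop's actual shuffle does the same grouping-by-key, but partitioned and sorted across machines, which is exactly why it is the expensive phase.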
Beginners will run into a lot of problems, and that is normal. Problems are nothing to fear: as long as you work out how to solve them yourself, your ability will improve bit by bit. I wish everyone on the big data road success in their learning.