I. CCAH (Cloudera Certified Administrator for Apache Hadoop) Certification
Recommended preparation: Hadoop Administrator training
Exam format: 90 minutes; 70% passing score; 60 multiple-choice questions (each question indicates whether a single answer or multiple answers are expected)
Training content
Through classroom lectures and hands-on exercises, students will learn the following:
The cluster-management features of Cloudera Manager, such as log aggregation, configuration management, reporting, alerting, and service management.
How YARN, MapReduce, Spark, and HDFS work.
How to choose the right hardware and architecture for your cluster.
How to seamlessly integrate the Hadoop cluster with the existing systems of the enterprise.
How to use Flume for real-time data ingestion, and how to use Sqoop to import and export data between an RDBMS and the Hadoop cluster.
How to configure the Fair Scheduler to provide service-level guarantees for multiple users of the Hadoop cluster.
Best practices for operating and maintaining a Hadoop cluster in a production environment.
Hadoop cluster troubleshooting, problem diagnosis and performance tuning.
Target audience and prerequisites
"for system administrators and IT managers, Linux experience is required, no Apache Hadoop foundation is required.
Upon completion of this course, students are encouraged to prepare for and register for the Cloudera Certified Administrator for Apache Hadoop (CCAH) exam. Passing the exam and earning the certificate is strong evidence of a candidate's Hadoop skills and expertise to employers and customers.
II. CCA Spark and Hadoop Developer Certification
Recommended preparation: Spark and Hadoop Developer training
Exam format: 120 minutes; 70% passing score; solve 10 to 12 hands-on problems on a live CDH5 cluster.
Training content
Through classroom lectures and hands-on exercises, students will learn the following:
Distributed storage and processing of data on the Hadoop cluster.
Write, configure, and deploy Apache Spark applications on the Hadoop cluster.
Use the Spark shell for interactive data analysis.
Use Spark SQL queries to process structured data (a minimal sketch follows this list).
Use Spark Streaming to process streaming data.
Use Flume and Kafka to collect streaming data for Spark Streaming.
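The list above touches on interactive analysis with the Spark shell and Spark SQL. The following is a minimal PySpark sketch of that kind of work; it assumes Spark 2.x or later with the pyspark package installed, and the application name, sample data, and column names are hypothetical rather than taken from the course material.

# A minimal PySpark sketch of interactive Spark SQL analysis.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("spark-sql-sketch")
         .master("local[*]")  # replace with the cluster's master URL on a real cluster
         .getOrCreate())

# A small in-memory DataFrame standing in for data stored on HDFS.
orders = spark.createDataFrame(
    [("2024-01-01", "books", 12.5),
     ("2024-01-01", "music", 7.0),
     ("2024-01-02", "books", 3.2)],
    ["order_date", "category", "amount"])

# Register a temporary view and query it with Spark SQL.
orders.createOrReplaceTempView("orders")
spark.sql("""
    SELECT order_date, category, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date, category
    ORDER BY order_date, category
""").show()

spark.stop()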
Target audience and prerequisites
This course is suitable for developers and engineers with programming experience. No prior Apache Hadoop knowledge is required. The code and exercises in the Apache Spark portion of the training use Scala and Python, so students must be proficient in at least one of these two languages. Proficiency with the Linux command line and basic knowledge of SQL are also required.
III. CCA Data Analyst Certification
Recommended preparation: Data Analyst training
Exam format: 120 minutes; 70% passing score; solve 10 to 12 customer problems. For each problem, candidates must provide an accurate technical solution that meets all of the requirements, and they may use any tool or combination of tools on the cluster.
Training content
Through classroom lectures and hands-on exercises, students will become familiar with the Hadoop ecosystem. Topics of study include:
Functions provided by Pig, Hive, and Impala for data acquisition, storage, and analysis.
The fundamentals of Apache Hadoop, and how to use Hadoop tools for data ETL (extract, transform, and load), retrieval, and processing.
How Pig, Hive, and Impala improve the efficiency of typical analysis tasks.
Join a wide variety of data sets to gain valuable business insight (a minimal query sketch follows this list).
Perform real-time, complex dataset queries.
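As a rough illustration of the join queries mentioned above, here is a hedged sketch that runs an Impala query from Python using the third-party impyla package; the daemon host and the table and column names are assumptions for illustration only, and the same SQL would also run in Hive or the impala-shell.

# A hedged sketch of querying Impala from Python via the impyla package.
# The host, tables, and columns are hypothetical; 21050 is Impala's default
# port for external clients.
from impala.dbapi import connect

conn = connect(host="impala-daemon.example.com", port=21050)
cur = conn.cursor()

# Join two data sets to answer a simple business question: revenue per region.
cur.execute("""
    SELECT c.region, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON o.customer_id = c.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC
""")
for region, revenue in cur.fetchall():
    print(region, revenue)

cur.close()
conn.close()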
Target audience and prerequisites
This course is designed for data analysts, business intelligence specialists, developers, system architects, and database administrators. Trainees must have some knowledge of SQL and basic familiarity with the Linux command line. Familiarity with at least one scripting language (for example, Bash, Perl, Python, or Ruby) is helpful but not required. No prior knowledge of Apache Hadoop is needed.