2025-02-24 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)05/31 Report--
This article explains how to execute MapReduce programs remotely from Eclipse. Many people run into trouble with this in practice, so the following walks through the configuration and the most common errors step by step. I hope you read it carefully and get something out of it!
1. Environment description
Win7, 64-bit
Hadoop 1.2.1, running on three VirtualBox hosts: bd11, bd12, bd13
Eclipse Indigo (the 3.x series must be used; if the plug-in is placed in a 4.x Eclipse it reports an error, a classpath problem in META-INF/MANIFEST.MF that keeps the classes from being found)
A self-compiled Eclipse plug-in for hadoop-1.2.1
2. Plug-in parameter configuration
2.1. Eclipse configuration
Configure the local hadoop path
2.2. Plug-in configuration-General
Advanced parameters:

parameter name        parameter value                   reference file
fs.default.name       hdfs://bd11:9000                  core-site.xml
hadoop.tmp.dir        /home/wukong/usr/hadoop-tmp       core-site.xml
mapred.job.tracker    bd11:9001                         mapred-site.xml

2.3. Possible errors
If the above parameters are not configured correctly, an error may be reported.
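For reference, the plug-in's Advanced parameters are meant to mirror the cluster's own configuration files, so the safest way to fill them in is to copy the values from the cluster. A sketch of the corresponding server-side entries, using the host names and paths from this article's setup (verify against your actual cluster files):

```xml
<!-- core-site.xml -->
<property>
    <name>fs.default.name</name>
    <value>hdfs://bd11:9000</value>
</property>
<property>
    <name>hadoop.tmp.dir</name>
    <value>/home/wukong/usr/hadoop-tmp</value>
</property>

<!-- mapred-site.xml -->
<property>
    <name>mapred.job.tracker</name>
    <value>bd11:9001</value>
</property>
```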
3. Run
3.1. Run parameters
Simply fill in the run parameters here, in the Run Configuration dialog.
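For example, for the MaxTemperature job that appears later in this article's stack traces, the Program arguments field might look like the line below. The paths are hypothetical; the point is to use full HDFS URIs so the job reads from and writes to the cluster rather than the local file system:

```text
hdfs://bd11:9000/user/wukong/input hdfs://bd11:9000/user/wukong/output
```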
3.2. Running process
3.3. Result files of the run
4. Possible errors
4.1. Opening or refreshing a directory, or creating a directory, gives no response and reports no error
This usually means the referenced jar packages or the parameters configured above are incorrect.
4.2. Running a MapReduce program throws an IPC version mismatch: Server IPC version 9 cannot communicate with client version 4
This is caused by a mismatch between the version of the hadoop jars referenced in the project and the version running on the server.
4.3. Permission problem 1
The error is as follows:
14-08-21 21:39:29 ERROR security.UserGroupInformation: PriviledgedActionException as:zhaiph cause:java.io.IOException: Failed to set permissions of path: tmphadoop-xxx.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: tmphadoop-xxx.staging to 0700
    at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:691)
    at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:664)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:349)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:193)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:126)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:942)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
    at MaxTemperature.main(MaxTemperature.java:36)
Solution:
This is a problem in hadoop-core-1.2.1.jar. There are two ways to fix it: either recompile the jar, or modify the source file org.apache.hadoop.fs.FileUtil.java and rebuild. Change the source as follows (comment out the body of the check):

    private static void checkReturnValue(boolean rv, File p,
                                         FsPermission permission
                                         ) throws IOException {
        // if (!rv) {
        //     throw new IOException("Failed to set permissions of path: " + p +
        //         " to " +
        //         String.format("%04o", permission.toShort()));
        // }
    }

4.4. Permission problem 2
Error description
14-08-21 21:43:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14-08-21 21:43:12 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14-08-21 21:43:12 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14-08-21 21:43:12 INFO input.FileInputFormat: Total input paths to process : 1
14-08-21 21:43:12 WARN snappy.LoadSnappy: Snappy native library not loaded
14-08-21 21:43:13 INFO mapred.JobClient: Running job: job_local1395395251_0001
14-08-21 21:43:13 WARN mapred.LocalJobRunner: job_local1395395251_0001
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=Johnson, access=WRITE, inode="ch02":wukong:supergroup:rwxr-xr-x
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:95)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1459)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:362)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1161)
    at org.apache.hadoop.mapred.FileOutputCommitter.setupJob(FileOutputCommitter.java:52)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:319)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=Johnson, access=WRITE, inode="ch02":wukong:supergroup:rwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:217)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:197)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:141)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5758)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5731)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2502)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2469)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:911)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
    at org.apache.hadoop.ipc.Client.call(Client.java:1113)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at com.sun.proxy.$Proxy1.mkdirs(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
    at com.sun.proxy.$Proxy1.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1457)
    ... 4 more
14-08-21 21:43:14 INFO mapred.JobClient:  map 0% reduce 0%
14-08-21 21:43:14 INFO mapred.JobClient: Job complete: job_local1395395251_0001
14-08-21 21:43:14 INFO mapred.JobClient: Counters: 0
Alternatively, a permission-denied prompt may appear directly on a node in Eclipse's DFS directory tree.
Solution: add the following section to hdfs-site.xml, which causes dfs not to verify permissions:

    <property>
        <name>dfs.permissions</name>
        <value>false</value>
    </property>

4.5. winutils.exe cannot be found on Win7
The error is as follows:
2015-02-05 15:19:43,080 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-02-05 15:19:43 ERROR [main] util.Shell (Shell.java:getWinUtilsPath(336)) - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable D:\Lab\lib\hadoop\hadoop-2.4.1\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:232)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:718)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:703)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:605)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2554)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2546)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2412)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at wukong.hadoop.tools.HdfsTool.rm(HdfsTool.java:26)
    at sample.FlowMR.main(FlowMR.java:21)
2015-02-05 15:19:51,141 INFO [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(1009)) - session.id is deprecated.
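The original article stops at this error without giving a fix, so as a hedged note: a commonly used workaround is to point hadoop.home.dir (or the HADOOP_HOME environment variable) at a local directory whose bin folder contains winutils.exe, before the first Hadoop class is loaded. A minimal sketch; the D:\hadoop-2.4.1 path is a hypothetical example, not from the article:

```java
public class WinutilsWorkaround {
    public static void main(String[] args) {
        // Must run before any Hadoop class (e.g. FileSystem) is touched, because
        // org.apache.hadoop.util.Shell resolves winutils.exe in a static initializer.
        // "D:\\hadoop-2.4.1" is a hypothetical path; its bin\ must contain winutils.exe.
        System.setProperty("hadoop.home.dir", "D:\\hadoop-2.4.1");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Setting the property in code is convenient for Eclipse runs because it does not require changing the system environment; setting HADOOP_HOME achieves the same effect machine-wide.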
That is all for "how Eclipse executes MapReduce programs remotely". Thank you for reading. If you want to know more about the industry, you can follow the site; the editor will keep putting out practical, high-quality articles for you!