SLTechnology News&Howtos > Internet Technology (2025-01-18 update)
Shulou (Shulou.com), 06/03 report:
1. My transformations run without any problem, but as soon as I execute an action, an error is thrown. (Spark transformations are lazy, so a bad input path only surfaces when an action forces evaluation.) A few small issues came up along the way:
distFile = sc.textFile("hdfs://user/spark/test/201201.csv")
distFile.map(lambda s: len(s)).reduce(lambda a, b: a + b)
The first statement runs fine, but the second one raises an error:
Illegal character in scheme name at index 0: hdfs://user/spark/test/201201.csv
After searching for a long time, I found that I had accidentally copied an extra space at the start of the address, which caused this error.
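As a quick illustration (plain Python, no Spark needed; the helper name is mine, not from the article): a leading space means the URI scheme no longer starts at index 0, which is exactly what the error message complains about. A defensive strip() before handing the path to sc.textFile avoids this class of copy-paste mistakes:

```python
def clean_uri(uri: str) -> str:
    """Strip stray whitespace from a copied path.

    A leading space makes Hadoop's URI parser fail with
    'Illegal character in scheme name at index 0'.
    """
    return uri.strip()

# A path pasted with an accidental leading space.
bad = " hdfs://user/spark/test/201201.csv"
print(clean_uri(bad))  # hdfs://user/spark/test/201201.csv
```

With this in place, sc.textFile(clean_uri(path)) never sees the stray whitespace.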
After correcting it and running again, another error appeared. The message mentioned something under java.net, but I didn't save it. After some checking, I changed the address to include the NameNode host and port:
distFile = sc.textFile("hdfs://master:8020/user/spark/test/201201.csv")
Then yet another error was reported: pyspark.sql.utils.IllegalArgumentException: 'java.net.UnknownHostException: user'
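The exception text gives away the cause: in a URI of the form hdfs://user/spark/..., the first component after // is treated as the NameNode host, so the client tries to resolve a machine named user. A stdlib sketch (plain Python, independent of Spark) shows how such a URI is split:

```python
from urllib.parse import urlparse

uri = "hdfs://user/spark/test/201201.csv"
parsed = urlparse(uri)

# "user" lands in the authority (host) slot, not in the path, so the
# HDFS client tries to resolve a NameNode named "user"; hence
# java.net.UnknownHostException: user.
print(parsed.netloc)  # user
print(parsed.path)    # /spark/test/201201.csv
```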
After that I read through some material that wasn't very helpful, and then it occurred to me: why insist on an absolute URI at all? Why not try a path relative to the default filesystem? So:
distFile = sc.textFile("/user/spark/test/201201.csv")  # no scheme: resolved against the default filesystem
This time it executed normally. The earlier problem was presumably in the master:8020 part of the URI; trying different settings there should also be able to fix it. (Note that in hdfs://user/spark/..., 'user' sits in the host position, which explains the UnknownHostException.) Also, during development, try not to hard-code absolute hdfs:// URIs; prefer paths relative to the default filesystem.
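This also explains why the scheme-less path behaves better: with no hdfs://host prefix there is nothing to mistake for a hostname, and Hadoop resolves the path against the NameNode configured as fs.defaultFS in core-site.xml (standard Hadoop behaviour, not something stated in the article). The same stdlib parser shows the difference:

```python
from urllib.parse import urlparse

# No scheme and no authority: nothing can be mistaken for a hostname.
relative = "/user/spark/test/201201.csv"
parsed = urlparse(relative)
print(parsed.scheme)  # '' (empty; Hadoop falls back to fs.defaultFS)
print(parsed.netloc)  # '' (empty; no host to resolve)
print(parsed.path)    # /user/spark/test/201201.csv
```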