I am using a 10-node HDP cluster, where I am trying to run a simple WordCount job through a Bash shell script. Below are the command-line arguments I am using.
yarn jar /usr/hdp/2.6.5.0-292/hadoop-mapreduce/hadoop-streaming-2.7.3.2.6.5.0-292.jar \
-mapper 'wc -l' \
-reducer './reducer_wordcount.sh' \
-file /home/pathirippilly/map_reduce_jobs/shell_scripts/reducer_wordcount.sh \
-numReduceTasks 1 \
-input /user/pathirippilly/cards/smalldeck.txt \
-output /user/pathirippilly/mapreduce_jobs/output_shell
- Here reducer_wordcount.sh is the reducer shell script, available in my local directory /home/pathirippilly/map_reduce_jobs/shell_scripts (a quick check of these paths is sketched after this list)
- smalldeck.txt is the input file in the HDFS directory /user/pathirippilly/cards
- /user/pathirippilly/mapreduce_jobs/output_shell is the output directory
- The Hadoop version I am using is Hadoop 2.7.3.2.6.5.0-292
- I am running the above MapReduce job in YARN mode
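For completeness, this is the kind of pre-flight check I run before submitting; a minimal sketch, assuming the paths are exactly as listed above:

# confirm the reducer script exists locally and is executable
ls -l /home/pathirippilly/map_reduce_jobs/shell_scripts/reducer_wordcount.sh
chmod +x /home/pathirippilly/map_reduce_jobs/shell_scripts/reducer_wordcount.sh

# confirm the input file is in HDFS and remove any stale output directory
hdfs dfs -ls /user/pathirippilly/cards/smalldeck.txt
hdfs dfs -rm -r -f /user/pathirippilly/mapreduce_jobs/output_shell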
reducer_wordcount.sh contains:
#! /user/bin/env bash
awk '{line_count += $1} END { print line_count }'
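The reducer logic itself can be sanity-checked locally by piping sample mapper output through the same awk; a minimal sketch, the numbers are made-up per-split line counts, not my real data:

# each mapper ('wc -l') emits the line count of its split; the reducer should sum them
printf '26\n26\n' | awk '{line_count += $1} END { print line_count }'
# prints 52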
When I run this on my cluster, I get the below error for reducer_wordcount.sh:
Error: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:410)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
... 9 more
Caused by: java.lang.RuntimeException: configuration exception
at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:222)
at org.apache.hadoop.streaming.PipeReducer.configure(PipeReducer.java:67)
... 14 more
Caused by: java.io.IOException: Cannot run program "/hdp01/hadoop/yarn/local/usercache/pathirippilly/appcache/application_1533622723243_17238/container_e38_1533622723243_17238_01_000004/./reducer_wordcount.sh": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:209)
... 15 more
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
at java.lang.ProcessImpl.start(ProcessImpl.java:134)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
If I run the same reducer logic inline as a command-line argument, as shown below, it works:
yarn jar /usr/hdp/2.6.5.0-292/hadoop-mapreduce/hadoop-streaming.jar \
-mapper 'wc -l' \
-reducer "awk '{line_count += \$1} END { print line_count }'" \
-numReduceTasks 1 \
-input /user/pathirippilly/cards/smalldeck.txt \
-output /user/pathirippilly/mapreduce_jobs/output_shell
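For reference, the deprecation warning in the full log below suggests the generic -files option instead of -file. The equivalent submission would look roughly like this (a sketch only, not something I have verified; generic options such as -files have to come before the streaming options):

yarn jar /usr/hdp/2.6.5.0-292/hadoop-mapreduce/hadoop-streaming.jar \
-files /home/pathirippilly/map_reduce_jobs/shell_scripts/reducer_wordcount.sh \
-mapper 'wc -l' \
-reducer 'reducer_wordcount.sh' \
-numReduceTasks 1 \
-input /user/pathirippilly/cards/smalldeck.txt \
-output /user/pathirippilly/mapreduce_jobs/output_shell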
Hoping for some help here; I am quite new to Hadoop streaming.
The full error stack is below:
18/09/09 10:10:02 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead.
packageJobJar: [reducer_wordcount.sh] [/usr/hdp/2.6.5.0-292/hadoop-mapreduce/hadoop-streaming-2.7.3.2.6.5.0-292.jar] /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir/streamjob8506373101127930734.jar tmpDir=null
18/09/09 10:10:03 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
18/09/09 10:10:03 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
18/09/09 10:10:03 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
18/09/09 10:10:03 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
18/09/09 10:10:05 INFO mapred.FileInputFormat: Total input paths to process : 1
18/09/09 10:10:06 INFO mapreduce.JobSubmitter: number of splits:2
18/09/09 10:10:07 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1533622723243_17238
18/09/09 10:10:08 INFO impl.YarnClientImpl: Submitted application application_1533622723243_17238
18/09/09 10:10:08 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:19288/proxy/application_1533622723243_17238/
18/09/09 10:10:08 INFO mapreduce.Job: Running job: job_1533622723243_17238
18/09/09 10:10:14 INFO mapreduce.Job: Job job_1533622723243_17238 running in uber mode : false
18/09/09 10:10:14 INFO mapreduce.Job: map 0% reduce 0%
18/09/09 10:10:19 INFO mapreduce.Job: map 100% reduce 0%
18/09/09 10:10:23 INFO mapreduce.Job: Task Id : attempt_1533622723243_17238_r_000000_0, Status : FAILED
Error: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:410)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
... 9 more
Caused by: java.lang.RuntimeException: configuration exception
at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:222)
at org.apache.hadoop.streaming.PipeReducer.configure(PipeReducer.java:67)
... 14 more
Caused by: java.io.IOException: Cannot run program "/hdp01/hadoop/yarn/local/usercache/pathirippilly/appcache/application_1533622723243_17238/container_e38_1533622723243_17238_01_000004/./reducer_wordcount.sh": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:209)
... 15 more
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
at java.lang.ProcessImpl.start(ProcessImpl.java:134)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
... 16 more