Hadoop example leads to internal error: IOException in MWMCR.mclFeval

3 views (last 30 days)
I am trying to get the Hadoop example working on a Hadoop cluster. When I run the job, it fails with `Error: java.io.IOException: com.mathworks.toolbox.javabuilder.internal.MWMCR.mclFeval(Native Method)`. Any idea what that could be, or how to troubleshoot it?
Here is more of the log:
```
java.library.path: /usr/lib/hadoop/lib/native
HDFSCTFPath=hdfs://cluster-for-cameron-m:8020/user/root/airlinesmall/airlinesmall.ctf
Uploading CTF into distributed cache completed.
16/05/20 06:13:06 INFO gcs.GoogleHadoopFileSystemBase: GHFS version: 1.4.5-hadoop2
tmpjars: file:///usr/local/MATLAB/MATLAB_Runtime/v90/toolbox/mlhadoop/jar/a2.2.0/mwmapreduce.jar
jar: file:///usr/local/MATLAB/MATLAB_Runtime/v90/toolbox/mlhadoop/jar/a2.2.0/mwmapreduce.jar
mapred.child.env: MCR_CACHE_ROOT=/tmp,LD_LIBRARY_PATH=/usr/local/MATLAB/MATLAB_Runtime/v90/runtime/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v90/bin/glnxa64
mapred.child.java.opts: -Xmx200m -Djava.library.path=/usr/local/MATLAB/MATLAB_Runtime/v90/runtime/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v90/bin/glnxa64
New java.library.path: /usr/lib/hadoop/lib/native:/usr/local/MATLAB/MATLAB_Runtime/v90/runtime/glnxa64:/usr/local/MATLAB/MATLAB_Runtime/v90/bin/glnxa64
Using MATLAB mapper.
Set input format class to: ChunkFileRecordReader.
Using Hadoop default reducer.
Set outputformat class to: class org.apache.hadoop.mapreduce.lib.output.TextOutputFormat
Set map output key class to: class org.apache.hadoop.io.Text
Set map output value class to: class org.apache.hadoop.io.Text
Set reduce output key class to: class org.apache.hadoop.io.LongWritable
Set reduce output value class to: class org.apache.hadoop.io.Text
*************** run ******************
16/05/20 06:13:06 INFO client.RMProxy: Connecting to ResourceManager at cluster-for-cameron-m/10.240.0.16:8032
16/05/20 06:13:07 INFO input.FileInputFormat: Total input paths to process : 1
16/05/20 06:13:07 INFO mapreduce.JobSubmitter: number of splits:1
16/05/20 06:13:07 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1463693383665_0014
16/05/20 06:13:07 INFO impl.YarnClientImpl: Submitted application application_1463693383665_0014
16/05/20 06:13:07 INFO mapreduce.Job: The url to track the job: http://cluster-for-cameron-m:8088/proxy/application_1463693383665_0014/
16/05/20 06:13:07 INFO mapreduce.Job: Running job: job_1463693383665_0014
16/05/20 06:13:13 INFO mapreduce.Job: Job job_1463693383665_0014 running in uber mode : false
16/05/20 06:13:13 INFO mapreduce.Job: map 0% reduce 0%
16/05/20 06:13:22 INFO mapreduce.Job: Task Id : attempt_1463693383665_0014_m_000000_0, Status : FAILED
Error: java.io.IOException: com.mathworks.toolbox.javabuilder.internal.MWMCR.mclFeval(Native Method)
com.mathworks.toolbox.javabuilder.internal.MWMCR.access$600(MWMCR.java:31)
com.mathworks.toolbox.javabuilder.internal.MWMCR$6.mclFeval(MWMCR.java:861)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
com.mathworks.toolbox.javabuilder.internal.MWMCR$5.invoke(MWMCR.java:759)
com.sun.proxy.$Proxy17.mclFeval(Unknown Source)
com.mathworks.toolbox.javabuilder.internal.MWMCR.invoke(MWMCR.java:427)
com.mathworks.hadoop.MWMapReduceDriver$MWMap.invokeMap(MWMapReduceDriver.java:454)
com.mathworks.hadoop.MWMapReduceDriver$MWMap.run(MWMapReduceDriver.java:401)
org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
java.security.AccessController.doPrivileged(Native Method)
javax.security.auth.Subject.doAs(Subject.java:422)
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
at com.mathworks.hadoop.MWMapReduceDriver$MWMap.invokeMap(MWMapReduceDriver.java:473)
at com.mathworks.hadoop.MWMapReduceDriver$MWMap.run(MWMapReduceDriver.java:401)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
```

Answer (1)

Cameron Taggart on 20 May 2016
Edited: Cameron Taggart on 20 May 2016
This error occurred when I used `gs://matlab-build/airlinesmall.csv` as the input path instead of `hdfs:///airlinesmall.csv`. Is it possible to make the former work? Google provides a Cloud Storage connector for Hadoop, which lets me use `gs://` paths from other tools such as the `hadoop fs` command. This works:
```
hadoop fs -cp gs://matlab-build/airlinesmall.csv hdfs:///
```
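As a sanity check before re-running the job, a small sketch (assuming the bucket and file names from the copy command above) that verifies the file was actually staged into HDFS. These commands require a cluster with the GCS connector installed, so they are illustrative rather than tested here:

```shell
# The GCS connector resolves the gs:// scheme for hadoop fs,
# even though the MATLAB mapper job fails when handed a gs:// input path.
hadoop fs -cp gs://matlab-build/airlinesmall.csv hdfs:///

# Confirm the file is visible at the hdfs:/// location
# the job should then be pointed at.
hadoop fs -ls hdfs:///airlinesmall.csv
```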
