Running Mahout today produced the following exception:
java.net.ConnectException: Call From master01/192.168.70.101 to master01:10020
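Port 10020 is the default for mapreduce.jobhistory.address, so this usually means the MapReduce JobHistory Server is not running on master01. A sketch of the relevant mapred-site.xml entries (hostname and ports here mirror this cluster; adjust to yours):

```xml
<!-- mapred-site.xml: point clients at the JobHistory Server -->
<property>
  <name>mapreduce.jobhistory.address</name>
  <value>master01:10020</value>
</property>
<property>
  <name>mapreduce.jobhistory.webapp.address</name>
  <value>master01:19888</value>
</property>
```

With that in place, starting the daemon via `mr-jobhistory-daemon.sh start historyserver` (shipped in Hadoop's sbin/) should make the port reachable.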
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
This warning has shown up constantly since installing Hadoop 2.6.0. I originally assumed it could be ignored, but it causes problems when Sqoop starts up, so it has to be dealt with after all.
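A commonly cited workaround (a config sketch, assuming $HADOOP_HOME points at the install and the native libraries live under lib/native) is to tell the JVM where the native libraries are in etc/hadoop/hadoop-env.sh; `hadoop checknative -a` then reports whether they actually load:

```
# etc/hadoop/hadoop-env.sh -- assumed paths; adjust HADOOP_HOME to your install
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```

If checknative still shows `hadoop: false` afterwards, the library itself is the problem (wrong architecture or missing), not the search path.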
java.io.IOException: Bad connect ack with firstBadLink as 192.168.70.102:50010
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1166)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
2014-08-23 05:43:33,968 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Abandoning BP-269011219-192.168.70.101-1408743309997:blk_1073741840_1016
2014-08-23 05:43:33,971 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.70.102:50010
2014-08-23 05:43:33,978 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.70.103:50010
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1166)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
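This error typically means the client reached the NameNode but could not open a data connection to the DataNode named by firstBadLink, most often because a firewall on that node blocks the DataNode transfer port 50010. A sketch: the live commands just pull the offending address out of the log line, and the commented commands show the usual firewall fix (CentOS-style, for test clusters).

```shell
# Pull the failing datanode out of the DFSClient log line.
LOG='java.io.IOException: Bad connect ack with firstBadLink as 192.168.70.102:50010'
BAD=$(echo "$LOG" | sed -n 's/.*firstBadLink as \([0-9.]*:[0-9]*\).*/\1/p')
echo "firstBadLink datanode: $BAD"

# On that node, open the DataNode transfer port (or, on a throwaway test
# cluster, stop the firewall outright):
#   iptables -I INPUT -p tcp --dport 50010 -j ACCEPT
#   service iptables stop        # CentOS 6; disables the firewall entirely
```

Note the log excludes 192.168.70.102 and then fails on 192.168.70.103 too, which is consistent with the same firewall rule being active on every slave.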
Running start-all.sh, I often find that the DataNodes have disappeared.
Digging into the logs to debug turned up the following message:
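The log message itself is not reproduced above, but a very common cause of vanishing DataNodes, especially after reformatting the NameNode, is an "Incompatible clusterIDs" error: the clusterID in the DataNode's VERSION file no longer matches the NameNode's. A sketch of the check, simulated here on two sample VERSION files (on a real cluster the files live under current/VERSION inside the dirs set by dfs.namenode.name.dir and dfs.datanode.data.dir):

```shell
# Simulate comparing the NameNode and DataNode VERSION files; the sample
# clusterID values below are made up for illustration.
DIR=$(mktemp -d)
printf 'clusterID=CID-namenode-example\n' > "$DIR/nn_VERSION"
printf 'clusterID=CID-datanode-example\n' > "$DIR/dn_VERSION"

NN=$(sed -n 's/^clusterID=//p' "$DIR/nn_VERSION")
DN=$(sed -n 's/^clusterID=//p' "$DIR/dn_VERSION")

if [ "$NN" != "$DN" ]; then
  # Fix: either wipe the DataNode data dir and let it re-register, or copy
  # the NameNode's clusterID into the DataNode's VERSION file.
  echo "clusterID mismatch: NameNode=$NN DataNode=$DN"
fi
```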
Since the author is poor and only has two shabby machines to work with, this walkthrough uses just one master and one slave.
The tools and installation steps were already covered in "Server端 單一叢集式Hadoop安裝" (server-side single-cluster Hadoop installation), so we skip straight to configuring the internal network.
After starting Hadoop with start-all.sh, many error messages appear. They don't prevent startup, but it's not yet clear what else they might affect.
Investigation shows the cause: I'm using the Hadoop tarball from the official site, and its native libraries are compiled for 32 bits.
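If the machine is 64-bit while the bundled native libraries are 32-bit, the standard remedy is to rebuild the native code from the Hadoop source release. A sketch; only the architecture check runs here, and the maven profile in the comments is the one documented in Hadoop's BUILDING.txt (protobuf 2.5, cmake, and zlib headers are prerequisites):

```shell
# Confirm the machine really is 64-bit before bothering with a rebuild.
ARCH=$(uname -m)
echo "machine architecture: $ARCH"

# On x86_64, rebuild the native libraries from the Hadoop source tree:
#   mvn package -Pdist,native -DskipTests -Dtar
# then replace $HADOOP_HOME/lib/native with the freshly built
# hadoop-dist/target/hadoop-2.6.0/lib/native directory.
```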