When I ran wordcount today, the following error appeared:

java.io.IOException: Bad connect ack with firstBadLink as 192.168.70.102:50010
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1166)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
2014-08-23 05:43:33,968 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Abandoning BP-269011219-192.168.70.101-1408743309997:blk_1073741840_1016
2014-08-23 05:43:33,971 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.70.102:50010
2014-08-23 05:43:33,978 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.70.103:50010
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1166)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)

At first I assumed the DataNodes on slave01 and slave02 were the problem.
But after checking with jps, both DataNodes turned out to be running normally.
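For reference, the check on each slave looks something like this; the PIDs and the exact set of daemons listed will depend on your own setup, so the output below is only illustrative:

$ jps
2730 DataNode
2843 NodeManager
3157 Jps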
It turned out that when I reinstalled the cluster, I had forgotten to disable the firewall on the slaves with the following commands:
service iptables stop
chkconfig iptables off
In short, running Hadoop without turning off the firewall simply does not work.
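Before re-running the job, one quick sanity check (assuming a telnet client is installed on the master) is to verify that the DataNode transfer port from the log, 50010, is now reachable from the master:

telnet 192.168.70.102 50010
telnet 192.168.70.103 50010

If the connection is refused or hangs even though jps shows the DataNode process running, the firewall is still blocking the port.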