java.io.IOException: Bad connect ack with firstBadLink as 192.168.70.102:50010
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1166)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
2014-08-23 05:43:33,968 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Abandoning BP-269011219-192.168.70.101-1408743309997:blk_1073741840_1016
2014-08-23 05:43:33,971 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Excluding datanode 192.168.70.102:50010
2014-08-23 05:43:33,978 INFO [Thread-54] org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.70.103:50010
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1166)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
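In many reports, "Bad connect ack with firstBadLink" means the client reached the first DataNode but the write pipeline could not reach the next DataNode (the firstBadLink address), most often because a firewall is blocking the HDFS data-transfer port (50010). A hedged checklist under that assumption; the address comes from the log above, and the service commands assume a CentOS 6-era distro:

```shell
# Check whether the flagged DataNode's data-transfer port is reachable
# (192.168.70.102:50010 is the address reported in the log above):
nc -zv 192.168.70.102 50010

# Quick test: stop the firewall on each DataNode (CentOS 6-style service
# name -- adjust for your distro):
sudo service iptables stop

# Longer-term fix: open just the data-transfer port instead:
sudo iptables -I INPUT -p tcp --dport 50010 -j ACCEPT
```

These are cluster-side operations sketched for illustration; they cannot be verified outside a running cluster, so treat them as a starting point, not a guaranteed fix.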
- Aug 23 Sat 2014 22:00
Fixing java.io.IOException: Bad connect ack with firstBadLink
- Aug 17 Sun 2014 14:39
Hands-on MapReduce -- Wordcount
After all the earlier work setting up the Hadoop environment and debugging it, it's finally time for the main topic.
Wordcount is the most classic Hadoop example, and its source code is available on the official site. Today, let's practice with the Wordcount that ships with Hadoop.
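The built-in example described above can be sketched as follows. The commented commands show one assumed way to run the real job (the examples-jar path and HDFS paths vary by version and setup, so they are illustrative, not from the original post); the live pipeline below reproduces the same computation locally so you can see what Wordcount outputs:

```shell
# On a cluster, the shipped example is typically run like this
# (jar location and HDFS paths are assumptions -- adjust to your install):
#   hdfs dfs -mkdir -p /user/hadoop/input
#   hdfs dfs -put words.txt /user/hadoop/input
#   hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar \
#       wordcount /user/hadoop/input /user/hadoop/output

# Local simulation of the same computation:
# split lines into words (map), group identical words (shuffle/sort),
# and count each group (reduce).
printf 'hello hadoop\nhello world\n' | tr ' ' '\n' | sort | uniq -c
```

The pipeline prints one line per distinct word with its count, which is exactly the `word<TAB>count` output the real job writes to `part-r-00000` files.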
- Aug 10 Sun 2014 00:03
Browse the filesystem link not working
- Aug 09 Sat 2014 23:27
HTTP ERROR 500 Problem accessing /nn_browsedfscontent.jsp. Reason:
- Aug 09 Sat 2014 23:23
java.io.IOException: Incompatible clusterIDs
Running start-all.sh, you will often find the DataNode has disappeared.
Checking the logs to debug reveals the following message:
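The usual cause of "Incompatible clusterIDs" is that the NameNode was reformatted, so its clusterID no longer matches the one each DataNode recorded in its VERSION file. A minimal sketch of the repair, run here against mock VERSION files under /tmp (on a real node, the paths come from dfs.namenode.name.dir and dfs.datanode.data.dir in hdfs-site.xml -- the values below are assumptions for the demo):

```shell
# Simulated layout; substitute your configured dirs on a real node.
mkdir -p /tmp/cid-demo/name/current /tmp/cid-demo/data/current
echo 'clusterID=CID-after-reformat' > /tmp/cid-demo/name/current/VERSION
echo 'clusterID=CID-stale'          > /tmp/cid-demo/data/current/VERSION

# The fix: copy the NameNode's clusterID into the DataNode's VERSION file.
# (The blunter alternative is deleting the DataNode data dir entirely,
#  which also works but destroys the blocks stored there.)
CID=$(grep '^clusterID=' /tmp/cid-demo/name/current/VERSION | cut -d= -f2)
sed -i "s/^clusterID=.*/clusterID=$CID/" /tmp/cid-demo/data/current/VERSION
```

Once the IDs match, restarting HDFS with start-dfs.sh should keep the DataNode alive.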
- Aug 07 Thu 2014 22:26
Server-side multi-node Hadoop cluster installation
Since the author is broke and only has two lousy machines to work with, this walkthrough uses just one master and one slave.
The tools and installation steps were all covered in "Server-side single-node Hadoop installation", so we'll skip straight to the internal-network configuration.
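The network configuration the post jumps to usually boils down to two things: identical host-name mappings on every machine, and the slaves file on the master. A hedged sketch for the one-master/one-slave layout described above; the host names and IPs are illustrative, and the demo writes to /tmp files rather than the real system paths:

```shell
# 1) Same host mappings on every machine, normally appended to /etc/hosts;
#    written to a demo file here instead of the real one:
cat > /tmp/hosts-demo <<'EOF'
192.168.70.101  master
192.168.70.102  slave1
EOF

# 2) The slaves file on the master ($HADOOP_HOME/etc/hadoop/slaves on
#    Hadoop 2.x, conf/slaves on 1.x) lists one DataNode host per line:
cat > /tmp/slaves-demo <<'EOF'
slave1
EOF
```

With these in place (plus passwordless ssh from master to slave1), start-dfs.sh on the master can reach and launch the slave's DataNode by name.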
- Aug 05 Tue 2014 22:14
Fixing ssh: could not resolve hostname
After starting Hadoop with start-all.sh, a pile of error messages appears. They don't block startup, but it's not yet clear what else they might affect.
Some digging shows the cause: the Hadoop package downloaded from the official site is compiled for 32-bit.
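A commonly reported workaround for this 32-bit native-library mismatch (short of recompiling Hadoop from source for 64-bit) is pointing Hadoop at its native library directory in hadoop-env.sh, which silences the warnings that the start scripts otherwise misparse as ssh host names. A config-fragment sketch, assuming a Hadoop 2.x layout with $HADOOP_HOME set:

```shell
# Append to $HADOOP_HOME/etc/hadoop/hadoop-env.sh:
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```

This is a configuration fragment, not a verified fix for every setup; if the warnings persist, rebuilding the native libraries for your platform is the thorough option.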
- Aug 03 Sun 2014 14:47
Server-side single-node Hadoop installation