A Hadoop job was running and I killed it; afterwards, every attempt to add or delete files on HDFS failed with a "Name node is in safe mode" error:
rmr: org.apache.hadoop.dfs.SafeModeException: Cannot delete /user/hadoop/input. Name node is in safe mode
The command that resolves it:
bin/hadoop dfsadmin -safemode leave   # force the NameNode out of safe mode
Adapted from: http://shutiao2008.iteye.com/blog/318950
Appendix: notes on safe mode

On startup the NameNode first enters safe mode. While the proportion of blocks reported by DataNodes falls short of the configured threshold (dfs.safemode.threshold.pct), the system stays in safe mode, i.e. in a read-only state.
dfs.safemode.threshold.pct (default 0.999f) means that at HDFS startup, the NameNode may leave safe mode only once the number of blocks reported by DataNodes reaches 0.999 of the block count recorded in its metadata; until then it stays read-only. Setting the value to 1 keeps HDFS in SafeMode forever.
The following line, taken from a NameNode startup log, shows the reported-block ratio of 1 reaching the threshold 0.9990:
The ratio of reported blocks 1.0000 has reached the threshold 0.9990. Safe mode will be turned off automatically in 18 seconds.
hadoop dfsadmin -safemode leave
There are two ways to leave safe mode:
1. Lower dfs.safemode.threshold.pct to a smaller value (the default is 0.999).
2. Force the NameNode out with hadoop dfsadmin -safemode leave.
http://bbs.hadoopor.com/viewthread.php?tid=61&extra=page%3D1
---------------------------------------------
Safe mode is exited when the minimal replication condition is reached, plus an extension
time of 30 seconds. The minimal replication condition is when 99.9% of the blocks in
the whole filesystem meet their minimum replication level (which defaults to one, and
is set by dfs.replication.min).
Precondition for leaving safe mode: 99.9% of the blocks in the whole filesystem (the default, configurable via dfs.safemode.threshold.pct) have reached their minimum replication level (default 1, configurable via dfs.replication.min).

dfs.safemode.threshold.pct   float   0.999
The proportion of blocks in the system that must meet the minimum replication level defined by dfs.replication.min before the namenode will exit safe mode. Setting this value to 0 or less forces the namenode not to start in safe mode. Setting this value to more than 1 means the namenode never exits safe mode.
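The three regimes of the threshold described above (zero or less, between 0 and 1, above 1) can be sketched in plain Java. This is an illustrative model of the documented behavior, not the actual NameNode code; the class and method names are made up:

```java
// Illustrative model of dfs.safemode.threshold.pct semantics:
// <= 0  -> the namenode never starts in safe mode,
// (0,1] -> leave once reported/total reaches the threshold,
// > 1   -> the namenode never exits safe mode.
class SafeModeCheck {
    static boolean canLeaveSafeMode(long reportedBlocks, long totalBlocks, float thresholdPct) {
        if (thresholdPct <= 0) {
            return true;   // effectively: do not start in safe mode
        }
        if (thresholdPct > 1) {
            return false;  // never exits safe mode
        }
        if (totalBlocks == 0) {
            return true;   // nothing to wait for
        }
        return (double) reportedBlocks / totalBlocks >= thresholdPct;
    }
}
```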
---------------------------------------------
Safe mode can be controlled with dfsadmin -safemode <value>; the values are:
enter - enter safe mode
leave - force the NameNode to leave safe mode
get   - report whether safe mode is on
wait  - block until safe mode is exited
The build process is as follows:
1. Download hadoop 1.0.3 (http://hadoop.apache.org/releases.html#Download) and unpack it into a directory of your choice (preferably an all-ASCII path; a path containing Chinese characters caused problems for me).
2. In Eclipse, import the ..\hadoop-1.0.3\src\contrib\eclipse-plugin project; the default project name is MapReduceTools.
3. Create a lib directory in the MapReduceTools project and copy into it hadoop-core (obtained by renaming hadoop-*.jar from the hadoop root directory), commons-cli-1.2.jar, commons-lang-2.4.jar, commons-configuration-1.6.jar, jackson-mapper-asl-1.8.8.jar, jackson-core-asl-1.8.8.jar, and commons-httpclient-3.0.1.jar.
4. Edit build-contrib.xml in the parent directory:
Find <property name="hadoop.root" location="${root}/../../../"/>, change location to the actual hadoop 1.0.3 extraction directory, and add below it:
<property name="eclipse.home" location="D:/Program Files/eclipse"/>
<property name="version" value="1.0.3"/>
5. Edit build.xml in the project directory:
<target name="jar" depends="compile" unless="skip.contrib">
<mkdir dir="${build.dir}/lib"/>
<copy file="${hadoop.root}/hadoop-core-${version}.jar" tofile="${build.dir}/lib/hadoop-core.jar" verbose="true"/>
<copy file="${hadoop.root}/lib/commons-cli-1.2.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.root}/lib/commons-lang-2.4.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.root}/lib/commons-configuration-1.6.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.root}/lib/jackson-mapper-asl-1.8.8.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.root}/lib/jackson-core-asl-1.8.8.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.root}/lib/commons-httpclient-3.0.1.jar" todir="${build.dir}/lib" verbose="true"/>
<jar
jarfile="${build.dir}/hadoop-${name}-${version}.jar"
manifest="${root}/META-INF/MANIFEST.MF">
<fileset dir="${build.dir}" includes="classes/ lib/"/>
<fileset dir="${root}" includes="resources/ plugin.xml"/>
</jar>
</target>
6. Right-click build.xml in Eclipse and choose Run As - Ant Build.
If you get a "package org.apache.hadoop.fs does not exist" error, modify build.xml:
<path id="hadoop-jars">
<fileset dir="${hadoop.root}/">
<include name="hadoop-*.jar"/>
</fileset>
</path>
Then add <path refid="hadoop-jars"/> inside <path id="classpath">.
7. Wait for the Ant build to finish. The built plugin is \build\contrib\hadoop-eclipse-plugin-1.0.3.jar.
8. Check that the configuration attributes in the built jar's META-INF/MANIFEST.MF are complete; if not, fill them in.
9. Drop the jar into eclipse/plugins, restart Eclipse, and verify the plugin installed correctly.
The plugin
Hadoop 1.0.2/src/contrib/eclipse-plugin ships only the plugin's source code, so here is a pre-built Eclipse plugin:
Download address
After downloading, drop it into eclipse/dropins (eclipse/plugins also works, but dropins is more convenient and recommended). Restart Eclipse and you should see Map/Reduce among the perspectives.
Configuration
Click the blue elephant icon and create a new Hadoop connection.
Note: the settings must be filled in correctly; I had changed some ports, as well as the default run-as username and so on.
The concrete settings can be seen in the screenshot (not reproduced here).
If everything is right, the cluster shows up in the project explorer,
and from there you can manage the HDFS distributed filesystem normally: upload, delete, and so on.
To prepare for the test below, first create a directory /user/root/input2, then upload two txt files into it:
input1.txt, containing: Hello Hadoop Goodbye Hadoop
input2.txt, containing: Hello World Bye World
With HDFS prepared, testing can begin.
The Hadoop project
Create a new Map/Reduce Project and point it at the local hadoop directory.
Create a test class WordCountTest:
(The WordCountTest listing, about 87 lines, is the stock Hadoop WordCount example; its driver portion reappears near the end of this post.)
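The counting logic that WordCountTest's TokenizerMapper and IntSumReducer implement can be shown in plain Java with no Hadoop dependency. This is an illustrative sketch (class and method names are made up), but on the two input files above it produces the same counts as the part-r-00000 output shown later:

```java
import java.util.Map;
import java.util.TreeMap;

// Plain-Java illustration of what TokenizerMapper + IntSumReducer compute:
// tokenize each line, emit (word, 1), and sum the counts per word.
class WordCountLogic {
    static Map<String, Integer> count(String... lines) {
        // TreeMap keeps keys sorted, mirroring the order in part-r-00000.
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            for (String word : line.trim().split("\\s+")) {
                counts.merge(word, 1, Integer::sum); // the "reduce" step
            }
        }
        return counts;
    }
}
```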
Right-click, choose "Run Configurations", and in the dialog open the "Arguments" tab; under "Program arguments" enter the parameters:
hdfs://master:9000/user/root/input2 hdfs://master:9000/user/root/output2
Note: these parameters are for local debugging, not the real environment.
Then click "Apply" and "Close". Now right-click the class and choose "Run on Hadoop" to run it.
At this point, however, an exception like the following appears:
12/04/24 15:32:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/04/24 15:32:44 ERROR security.UserGroupInformation: PriviledgedActionException as:Administrator cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:655)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at com.hadoop.learn.test.WordCountTest.main(WordCountTest.java:85)
This is a Windows file-permission problem; on Linux the job runs normally and the issue does not exist.
The workaround is to edit checkReturnValue in /hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java and simply comment its body out (crude, but on Windows the check can be skipped):
(The 13-line patch just comments out the body of checkReturnValue.)
Recompile and repackage hadoop-core-1.0.2.jar, then replace the one in the hadoop-1.0.2 root directory.
A pre-patched hadoop-core-1.0.2-modified.jar is provided here; just swap it in for the original hadoop-core-1.0.2.jar.
After replacing it, refresh the project, fix up the jar dependencies, and run WordCountTest again.
Once it succeeds, refresh the HDFS tree in Eclipse and you will see the generated output2 directory.
Click the part-r-00000 file to see the sorted result:
Bye 1
Goodbye 1
Hadoop 2
Hello 2
World 2
You can also debug the program normally with breakpoints (right-click -> Debug As -> Java Application). Remember to delete the output directory before every run.
Note that the plugin auto-generates jar files, along with other files including some concrete Hadoop configuration, under the Eclipse workspace\.metadata\.plugins\org.apache.hadoop.eclipse directory.
There are more details to explore at your leisure.
Exceptions encountered
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/root/output2/_temporary. Name node is in safe mode.
The ratio of reported blocks 0.5000 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2055)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2029)
at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:817)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
At the master node, turn off safe mode:
# bin/hadoop dfsadmin -safemode leave
How to package
Packaging the Map/Reduce project into a jar is straightforward and needs little explanation. Just make sure the jar's META-INF/MANIFEST.MF contains the Main-Class mapping:
Main-Class: com.hadoop.learn.test.TestDriver
If third-party jars are used, also add a Class-Path entry to MANIFEST.MF.
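The Main-Class and Class-Path attributes can also be generated programmatically with java.util.jar instead of editing MANIFEST.MF by hand. A small sketch; the Class-Path value passed in the usage below is a hypothetical example:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

// Builds a MANIFEST.MF carrying the Main-Class mapping (and, optionally,
// a Class-Path for third-party jars).
class ManifestBuilder {
    static Manifest build(String mainClass, String classPath) {
        Manifest m = new Manifest();
        Attributes a = m.getMainAttributes();
        // Manifest-Version is mandatory; without it, write() emits nothing.
        a.put(Attributes.Name.MANIFEST_VERSION, "1.0");
        a.put(Attributes.Name.MAIN_CLASS, mainClass);
        if (classPath != null) {
            a.put(Attributes.Name.CLASS_PATH, classPath);
        }
        return m;
    }

    static String render(Manifest m) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        m.write(out);
        return out.toString("UTF-8");
    }
}
```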
Alternatively, use the MapReduce Driver wizard the plugin provides; it lets you run jobs on Hadoop by a short alias, which is especially handy when the jar contains several Map/Reduce jobs.
A MapReduce Driver just wraps a main function that registers the aliases:
(The 28-line MapReduce Driver listing is not preserved here.)
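Judging from the command shown just below, the driver registers WordCountTest under the alias testcount; the plugin's wizard normally generates this with org.apache.hadoop.util.ProgramDriver. A plain-Java stand-in for that dispatch pattern (the class and method names here are illustrative, not Hadoop's):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Minimal stand-in for org.apache.hadoop.util.ProgramDriver:
// register each job's entry point under an alias, then dispatch on args[0]
// and hand the remaining arguments to the chosen job.
class AliasDriver {
    private final Map<String, Consumer<String[]>> jobs = new HashMap<>();

    void addJob(String alias, Consumer<String[]> jobMain) {
        jobs.put(alias, jobMain);
    }

    int run(String[] args) {
        if (args.length == 0 || !jobs.containsKey(args[0])) {
            return 1; // unknown alias: usage error, as ProgramDriver reports
        }
        String[] rest = new String[args.length - 1];
        System.arraycopy(args, 1, rest, 0, rest.length);
        jobs.get(args[0]).accept(rest);
        return 0;
    }
}
```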
A small trick here: right-click the MapReduce Driver class and choose Run on Hadoop. A jar is generated automatically under the Eclipse workspace\.metadata\.plugins\org.apache.hadoop.eclipse directory; upload it to HDFS or the remote hadoop root directory and run it:
# bin/hadoop jar LearnHadoop_TestDriver.java-460881982912511899.jar testcount input2 output3
OK, this post ends here.
From the analysis in the previous article, we know that whether Hadoop submits a job to the Cluster or runs it Locally is closely tied to the parameters in the configuration files inside the conf folder. Nor is it only job submission: many other classes consult conf too, so when submitting a job, be sure to put conf on your classpath.

Because Configuration loads resources and files through the current thread's context classloader, we take a dynamic-loading approach here: first register the required dependency libraries and resources, then build a URLClassLoader and install it as the thread context classloader.
public static ClassLoader getClassLoader() {
    ClassLoader parent = Thread.currentThread().getContextClassLoader();
    if (parent == null) {
        parent = EJob.class.getClassLoader();
    }
    if (parent == null) {
        parent = ClassLoader.getSystemClassLoader();
    }
    // classPath is a static List<URL> populated by EJob.addClasspath(...)
    return new URLClassLoader(classPath.toArray(new URL[0]), parent);
}
The code is simple; no need to belabor it. It is called like this:
EJob.addClasspath("/usr/lib/hadoop-0.20/conf");
ClassLoader classLoader = EJob.getClassLoader();
Thread.currentThread().setContextClassLoader(classLoader);
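Why this works can be demonstrated self-containedly (temp files stand in for a real conf directory): files in a directory registered with a URLClassLoader become visible as classpath resources to anyone consulting the thread context classloader, which is exactly how Configuration locates core-site.xml and friends. The class below is an illustrative sketch, not part of EJob:

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Demonstrates resource lookup through a thread-context URLClassLoader,
// the mechanism EJob relies on to make a conf directory visible.
class ContextLoaderDemo {
    static String findResource(File dir, String name) throws Exception {
        // A directory URL (with trailing slash) makes its files loadable.
        URLClassLoader loader = new URLClassLoader(
                new URL[] { dir.toURI().toURL() },
                Thread.currentThread().getContextClassLoader());
        ClassLoader old = Thread.currentThread().getContextClassLoader();
        Thread.currentThread().setContextClassLoader(loader);
        try {
            URL found = Thread.currentThread().getContextClassLoader().getResource(name);
            return found == null ? null : found.getPath();
        } finally {
            Thread.currentThread().setContextClassLoader(old); // always restore
        }
    }
}
```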
With the classloader in place, one step remains: packaging the Jar, i.e. having the project package its own classes into a Jar. Taking the standard Eclipse project layout as the example, what gets packaged are the classes in the bin folder.
public static File createTempJar(String root) throws IOException {
    if (!new File(root).exists()) {
        return null;
    }
    Manifest manifest = new Manifest();
    manifest.getMainAttributes().putValue("Manifest-Version", "1.0");
    final File jarFile = File.createTempFile("EJob-", ".jar",
            new File(System.getProperty("java.io.tmpdir")));

    // Delete the temporary jar when the JVM exits.
    Runtime.getRuntime().addShutdownHook(new Thread() {
        public void run() {
            jarFile.delete();
        }
    });

    JarOutputStream out = new JarOutputStream(new FileOutputStream(jarFile), manifest);
    createTempJarInner(out, new File(root), "");
    out.flush();
    out.close();
    return jarFile;
}

private static void createTempJarInner(JarOutputStream out, File f, String base)
        throws IOException {
    if (f.isDirectory()) {
        File[] fl = f.listFiles();
        if (base.length() > 0) {
            base = base + "/";
        }
        for (int i = 0; i < fl.length; i++) {
            createTempJarInner(out, fl[i], base + fl[i].getName());
        }
    } else {
        out.putNextEntry(new JarEntry(base));
        FileInputStream in = new FileInputStream(f);
        byte[] buffer = new byte[1024];
        int n = in.read(buffer);
        while (n != -1) {
            out.write(buffer, 0, n);
            n = in.read(buffer);
        }
        in.close();
    }
}
The public entry point is createTempJar, which takes the root path of the folder to package and supports nested subfolders: it recurses through the folder, adding the directory structure and each file to the Jar in turn. It is all basic file-stream work; the only slightly unfamiliar pieces are Manifest and JarOutputStream, and a look at the API docs makes them clear.
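As a self-contained sanity check of the same approach (stdlib only, temp paths, illustrative names): write one entry through JarOutputStream with a versioned Manifest, exactly as createTempJarInner does, then read the entry back with JarFile.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

// Round-trip: write a single entry the way createTempJarInner does,
// then verify it is present in the resulting jar.
class JarRoundTrip {
    static File writeJar(String entryName, byte[] payload) throws Exception {
        Manifest manifest = new Manifest();
        manifest.getMainAttributes().putValue("Manifest-Version", "1.0");
        File jarFile = File.createTempFile("demo-", ".jar");
        jarFile.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jarFile), manifest)) {
            out.putNextEntry(new JarEntry(entryName));
            out.write(payload);
        }
        return jarFile;
    }

    static boolean hasEntry(File jar, String entryName) throws Exception {
        try (JarFile jf = new JarFile(jar)) {
            return jf.getJarEntry(entryName) != null;
        }
    }
}
```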
Good, everything is ready; let's put it into practice, again using WordCount as the example:
// Add these statements. XXX
File jarFile = EJob.createTempJar("bin");
EJob.addClasspath("/usr/lib/hadoop-0.20/conf");
ClassLoader classLoader = EJob.getClassLoader();
Thread.currentThread().setContextClassLoader(classLoader);
Configuration conf = new Configuration();
String[] otherArgs = new GenericOptionsParser(conf, args)
.getRemainingArgs();
if (otherArgs.length != 2) {
System.err.println("Usage: wordcount <in> <out>");
System.exit(2);
}
Job job = new Job(conf, "word count");
job.setJarByClass(WordCountTest.class);
job.setMapperClass(TokenizerMapper.class);
job.setCombinerClass(IntSumReducer.class);
job.setReducerClass(IntSumReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
System.exit(job.waitForCompletion(true) ? 0 : 1);
Run it as a Java Application... and bang: a "No job jar file set" exception. Apparently job.setJarByClass(WordCountTest.class) failed to set the job's jar. Why?

Because that method uses WordCount.class's classloader to locate the Jar containing the class and then sets that Jar as the job's Jar. But our job Jar is only packaged at runtime, and WordCount.class's classloader is the AppClassLoader, whose search path cannot be changed once the JVM is running, so setJarByClass has no Jar to find. We must instead set the job Jar directly through JobConf's setJar, like this:

// And add this statement. XXX
((JobConf) job.getConfiguration()).setJar(jarFile.toString());

So we modify the example above by adding this statement, run it again as a Java Application, and it finally works.

This flavor of "Run on Hadoop" is simple to use and compatible; it is worth a try.

For lack of time, this example was only tested in pseudo-distributed mode on Ubuntu, but in principle it should also work on a real distributed cluster.

The end.