Getting the Hadoop source package and attaching it to Eclipse


Reposted from: http://blog.csdn.net/jiutianhe/article/details/39233609

If you want to do real development work, studying the source code helps enormously. Without understanding how things work internally, Hadoop is a black box, and when a problem comes up you have no idea where to start. So this article covers two things:

I. How to get the source code
II. How to attach the source in Eclipse

I. How to get the source code

1. Download the Hadoop Maven source package

(1) From the official site. First download the source package hadoop-2.4.0-src.tar.gz from the official download page. If you are not sure how to find it there, see the beginner's guide to the Hadoop website, downloading the various Hadoop (2.4) releases, and browsing the Hadoop API.

(2) From a network drive. You can also download it from: http://pan.baidu.com/s/1kToPuGB

2. Get the source ready with Maven

There are two ways to do this: from the command line, or from inside Eclipse. Here we mainly cover the command-line approach.

Working from the command line:

1. Extract the archive

While extracting the archive I ran into the errors below. They can be ignored; just keep going:
1: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-applicationhistoryservice\target\classes\org\apache\hadoop\yarn\server\applicationhistoryservice\ApplicationHistoryClientService$ApplicationHSClientProtocolHandler.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip
2: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-applicationhistoryservice\target\classes\org\apache\hadoop\yarn\server\applicationhistoryservice\timeline\LeveldbTimelineStore$LockMap$CountingReentrantLock.class: The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip
3: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-applicationhistoryservice\target\test-classes\org\apache\hadoop\yarn\server\applicationhistoryservice\webapp\TestAHSWebApp$MockApplicationHistoryManagerImpl.class: The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip
4: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\monitor\capacity\TestProportionalCapacityPreemptionPolicy$IsPreemptionRequestFor.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip
5: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\recovery\TestFSRMStateStore$TestFSRMStateStoreTester$TestFileSystemRMStore.class: The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip
6: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\recovery\TestZKRMStateStore$TestZKRMStateStoreTester$TestZKRMStateStoreInternal.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip
7: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\recovery\TestZKRMStateStoreZKClientConnections$TestZKClient$TestForwardingWatcher.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip
8: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\recovery\TestZKRMStateStoreZKClientConnections$TestZKClient$TestZKRMStateStore.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip
9: Cannot create file: D:\hadoop2\hadoop-2.4.0-src\hadoop-yarn-project\hadoop-yarn\hadoop-yarn-server\hadoop-yarn-server-resourcemanager\target\test-classes\org\apache\hadoop\yarn\server\resourcemanager\rmapp\attempt\TestRMAppAttemptTransitions$TestApplicationAttemptEventDispatcher.class: The total length of the path and file name cannot exceed 260 characters. The system cannot find the path specified.    D:\hadoop2\hadoop-2.4.0-src.zip

(These errors are just the Windows 260-character path limit tripping over the deeply nested class files inside the archive; none of it matters for what follows, so we move on.)

2. Run the Maven build

Note that Maven needs the JDK and protoc installed first; if they are not installed yet, see the guide on installing Maven and protoc on Windows 7.

(1) Change into hadoop-2.4.0-src\hadoop-maven-plugins and run mvn install:

D:\hadoop2\hadoop-2.4.0-src\hadoop-maven-plugins>mvn install

The output looks like the following:

[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.apache.hadoop:hadoop-maven-plugins:maven-plugin:2.4.0
[WARNING] 'build.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.apache.maven.plugins:maven-enforcer-plugin @ org.apache.hadoop:hadoop-project:2.4.0, D:\hadoop2\hadoop-2.4.0-src\hadoop-project\pom.xml, line 1015, column 15
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
[INFO]
[INFO] Using the builder org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder with a thread count of 1
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Maven Plugins 2.4.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-maven-plugins ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-plugin-plugin:3.0:descriptor (default-descriptor) @ hadoop-maven-plugins ---
[INFO] Using 'UTF-8' encoding to read mojo metadata.
[INFO] Applying mojo extractor for language: java-annotations
[INFO] Mojo extractor for language: java-annotations found 2 mojo descriptors.
[INFO] Applying mojo extractor for language: java
[INFO] Mojo extractor for language: java found 0 mojo descriptors.
[INFO] Applying mojo extractor for language: bsh
[INFO] Mojo extractor for language: bsh found 0 mojo descriptors.
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-maven-plugins ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-maven-plugins ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-plugin-plugin:3.0:descriptor (mojo-descriptor) @ hadoop-maven-plugins ---
[INFO] Using 'UTF-8' encoding to read mojo metadata.
[INFO] Applying mojo extractor for language: java-annotations
[INFO] Mojo extractor for language: java-annotations found 2 mojo descriptors.
[INFO] Applying mojo extractor for language: java
[INFO] Mojo extractor for language: java found 0 mojo descriptors.
[INFO] Applying mojo extractor for language: bsh
[INFO] Mojo extractor for language: bsh found 0 mojo descriptors.
[INFO]
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-maven-plugins ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-maven-plugins ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hadoop-maven-plugins ---
[INFO] No tests to run.
[INFO]
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-maven-plugins ---
[INFO] Building jar: D:\hadoop2\hadoop-2.4.0-src\hadoop-maven-plugins\target\hadoop-maven-plugins-2.4.0.jar
[INFO]
[INFO] --- maven-plugin-plugin:3.0:addPluginArtifactMetadata (default-addPluginArtifactMetadata) @ hadoop-maven-plugins ---
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hadoop-maven-plugins ---
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hadoop-maven-plugins ---
[INFO] Installing D:\hadoop2\hadoop-2.4.0-src\hadoop-maven-plugins\target\hadoop-maven-plugins-2.4.0.jar to C:\Users\hyj\.m2\repository\org\apache\hadoop\hadoop-maven-plugins\2.4.0\hadoop-maven-plugins-2.4.0.jar
[INFO] Installing D:\hadoop2\hadoop-2.4.0-src\hadoop-maven-plugins\pom.xml to C:\Users\hyj\.m2\repository\org\apache\hadoop\hadoop-maven-plugins\2.4.0\hadoop-maven-plugins-2.4.0.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.891 s
[INFO] Finished at: 2014-06-23T14:47:33+08:00
[INFO] Final Memory: 21M/347M
[INFO] ------------------------------------------------------------------------

(2) Run mvn eclipse:eclipse -DskipTests

Note that this time the command is run from HADOOP_HOME, which in my case is D:\hadoop2\hadoop-2.4.0-src:

D:\hadoop2\hadoop-2.4.0-src>mvn eclipse:eclipse -DskipTests

Part of the output looks like this:

[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [  0.684 s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [  0.720 s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  0.276 s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [  0.179 s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [  0.121 s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [  1.680 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [  1.802 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [  1.024 s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  0.160 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [  1.061 s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [  0.489 s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [  0.056 s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [  2.770 s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [  0.965 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [  0.629 s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [  0.284 s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.061 s]
[INFO] hadoop-yarn ....................................... SUCCESS [  0.052 s]
[INFO] hadoop-yarn-api ................................... SUCCESS [  0.842 s]
[INFO] hadoop-yarn-common ................................ SUCCESS [  0.322 s]
[INFO] hadoop-yarn-server ................................ SUCCESS [  0.065 s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [  0.972 s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [  0.580 s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [  0.379 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [  0.281 s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [  0.378 s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [  0.534 s]
[INFO] hadoop-yarn-client ................................ SUCCESS [  0.307 s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [  0.050 s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [  0.202 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [  0.194 s]
[INFO] hadoop-yarn-site .................................. SUCCESS [  0.057 s]
[INFO] hadoop-yarn-project ............................... SUCCESS [  0.066 s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [  0.091 s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [  1.321 s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [  0.786 s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [  0.456 s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [  0.508 s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [  0.834 s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [  0.541 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [  0.284 s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [  0.851 s]
[INFO] hadoop-mapreduce .................................. SUCCESS [  0.099 s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [  0.742 s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [  0.335 s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [  0.397 s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [  0.371 s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [  0.230 s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [  0.184 s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [  0.217 s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [  0.048 s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [  0.244 s]
[INFO] Apache Hadoop Client .............................. SUCCESS [  0.590 s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [  0.230 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [  0.650 s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [  0.334 s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [  0.042 s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [  0.144 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 31.234 s
[INFO] Finished at: 2014-06-23T14:55:08+08:00
[INFO] Final Memory: 84M/759M
[INFO] ------------------------------------------------------------------------

At this point the source and its dependencies have been pulled down, and you will notice that the files in the source tree are noticeably larger.

II. How to attach the source in Eclipse

Suppose we have the following program: hadoop2.2mapreduce例子.rar (1.14 MB). As the screenshot in the original post shows, it packages two files: MaxTemperature.zip, which is the MapReduce example itself, and mockito-core-1.8.5.jar, a jar that the example depends on. (The example targets MapReduce 2.2 rather than 2.4, but that does not affect attaching the source; it is only used to demonstrate the procedure.)

Unpack it and import the example into Eclipse (if you are not familiar with importing projects, see the beginner's tutorial on importing Eclipse projects). After the import you will see a lot of red error markers; they are all caused by missing jar references, and the next step is to resolve them.
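Before resolving them, it helps to see what kind of code the example contains. The sketch below is an illustrative MaxTemperature-style mapper, written against a simplified input format of one "year,temperature" record per line; the actual classes inside MaxTemperature.zip may differ in detail. The point is that every org.apache.hadoop import in code like this has to resolve against the Hadoop jars, which is exactly why the project is covered in red markers until those jars are on the build path.

// Illustrative sketch only; not necessarily identical to the classes shipped in MaxTemperature.zip.
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MaxTemperatureMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Assumed input format: one "year,temperature" record per line, e.g. "1950,22".
        String[] fields = value.toString().split(",");
        if (fields.length == 2) {
            String year = fields[0].trim();
            int airTemperature = Integer.parseInt(fields[1].trim());
            // Emit (year, temperature); a reducer would keep the maximum per year.
            context.write(new Text(year), new IntWritable(airTemperature));
        }
    }
}

Once the jars are added and, later, the source archive is attached, pressing F3 or running Open Call Hierarchy on Mapper or context.write jumps straight into the corresponding Hadoop source.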
1. Resolve the missing jar references

(1) Add mockito-core-1.8.5.jar to the build path.

(2) Add the jar files from the compiled Hadoop 2.4 package. They live under share\hadoop inside HADOOP_HOME; on my machine that is D:\hadoop2\hadoop-2.4.0\share\hadoop. Add the jars found there to the build path, for example the jars inside each lib folder as well as the jars sitting directly alongside them. If you are not sure how to add jars to the build path, see the guide on Hadoop development setups and how to work with them.

(Note that what we reference here is the compiled package, i.e. the compiled 64-bit hadoop-2.4.0.tar.gz build. Link: http://pan.baidu.com/s/1c0vPjG0, password: xj6l.) For more downloads, see the roundup of jar files and installers for the Hadoop family, Storm, Spark, Linux, Flume and so on.

2. Attach the source

(1) Once the jars have been added, the errors are gone, as shown in the screenshot in the original post.

(2) The source still cannot be found. When we want to see how a class or method is implemented and run Open Call Hierarchy on it, Eclipse cannot locate a source file.

(3) Attach Source. Fill in the three places marked in the screenshot in order, select the compressed source archive, and click OK; that completes the setup.

Note: the archive to attach is the source tree we downloaded with Maven above, compressed afterwards (hadoop-2.4.0-src.zip). Make sure it really is a .zip file.

(4) Verify by browsing the source

Run the same steps again: Open Call Hierarchy now shows results, and double-clicking the main class (the entry highlighted in red in the screenshot) opens its contents.

Question: careful readers will notice something here. What opens is a .class file rather than a .java file, so will it differ from the corresponding .java source? It will not; the two are the same, and anyone who is curious can verify this for themselves.
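A quick way to convince yourself that the jars and the attached source line up is to run a tiny program against them. The sketch below is only an illustration: the class name HadoopVersionCheck is made up for this article, while org.apache.hadoop.util.VersionInfo is the stock Hadoop utility class behind the hadoop version command.

import org.apache.hadoop.util.VersionInfo;

public class HadoopVersionCheck {
    public static void main(String[] args) {
        // Prints the version baked into the Hadoop jars on the build path.
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Compiled by " + VersionInfo.getUser()
                + " from revision " + VersionInfo.getRevision());
    }
}

If it prints the expected 2.4.0 version, the jars are on the build path correctly, and pressing F3 (or using Open Call Hierarchy) on VersionInfo should now open readable Java source from the attached hadoop-2.4.0-src.zip rather than a bare .class view.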
