CDH Hadoop source build: mvn eclipse:eclipse fails

[root@zhumac1 hadoop-mapreduce-client]# mvn eclipse:eclipse
[INFO] Scanning for projects...
Downloading: https://repository.cloudera.co ... 8.pom
Downloading: http://repository.jboss.org/ne ... 8.pom
Downloading: http://DX2:8081/nexus/content/ ... 8.pom
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[WARNING] 'dependencyManagement.dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: javax.servlet.jsp:jsp-api:jar -> duplicate declaration of version 2.1 @ org.apache.hadoop:hadoop-project:2.6.0-cdh5.4.8, /root/hadoop-cdh/hadoop-project/pom.xml, line 588, column 19
[WARNING] 'build.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.apache.maven.plugins:maven-enforcer-plugin @ org.apache.hadoop:hadoop-project:2.6.0-cdh5.4.8, /root/hadoop-cdh/hadoop-project/pom.xml, line 1173, column 15
[FATAL] Non-resolvable parent POM for org.apache.hadoop:hadoop-main:2.6.0-cdh5.4.8: Could not transfer artifact com.cloudera.cdh:cdh-root:pom:5.4.8 from/to cdh.repo (https://repository.cloudera.co ... -repos): java.security.ProviderException: java.security.KeyException and 'parent.relativePath' points at wrong local POM @ org.apache.hadoop:hadoop-main:2.6.0-cdh5.4.8, /root/hadoop-cdh/pom.xml, line 19, column 11
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project org.apache.hadoop:hadoop-mapreduce-client:2.6.0-cdh5.4.8 (/root/hadoop-cdh/hadoop-mapreduce-project/hadoop-mapreduce-client/pom.xml) has 1 error
[ERROR] Non-resolvable parent POM for org.apache.hadoop:hadoop-main:2.6.0-cdh5.4.8: Could not transfer artifact com.cloudera.cdh:cdh-root:pom:5.4.8 from/to cdh.repo (https://repository.cloudera.co ... -repos): java.security.ProviderException: java.security.KeyException and 'parent.relativePath' points at wrong local POM @ org.apache.hadoop:hadoop-main:2.6.0-cdh5.4.8, /root/hadoop-cdh/pom.xml, line 19, column 11 -> [Help 2]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/conflu ... ption
[ERROR] [Help 2] http://cwiki.apache.org/conflu ... ption
[root@zhumac1 hadoop-mapreduce-client]#
I've downloaded the CDH source code and want to build it and import it into Eclipse, but just like with the Apache version yesterday, it still fails. Teacher, please take a look.

fish - Hadooper

Upvoted by: zhudejun_1985

Here:
1476459544213.png
 

fish - Hadooper

Upvoted by: zhudejun_1985

Please switch to the v250 branch. What you run is ./configure, not ./configure.ac.
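For context, an autotools project that ships only configure.ac has no configure script yet; ./autogen.sh generates one. A minimal sketch of the usual sequence, assuming the /root/protobuf checkout mentioned later in this thread:

```shell
# Sketch of the usual autotools build sequence for a protobuf checkout
# that ships configure.ac but no pre-generated configure script.
cd /root/protobuf
./autogen.sh        # generates ./configure from configure.ac
./configure         # run the generated script, not configure.ac itself
make
make install
ldconfig            # refresh the shared-library cache
protoc --version    # should now print a libprotoc version string
```

After this, re-run the Maven build in a shell where protoc is on the PATH.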

fish - Hadooper

Upvoted by:

It's a network problem: downloading the dependencies from Cloudera's site failed. If you're on one of the bootcamp machines, point the CDH mirror at the Nexus on DX2.
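For reference, pointing Maven at an internal Nexus is done with a mirror entry in ~/.m2/settings.xml. The sketch below infers the URL from the DX2 address in the build log; the repository path (the standard Nexus public group) is an assumption and may differ on your machine:

```xml
<!-- Hypothetical mirror entry for settings.xml; host taken from the log,
     repository path assumed to be the default Nexus public group. -->
<settings>
  <mirrors>
    <mirror>
      <id>dx2-nexus</id>
      <name>DX2 Nexus mirror</name>
      <!-- Route all repository traffic (including cdh.repo) through Nexus -->
      <mirrorOf>*</mirrorOf>
      <url>http://DX2:8081/nexus/content/groups/public/</url>
    </mirror>
  </mirrors>
</settings>
```

With `<mirrorOf>*</mirrorOf>`, every repository declared in the Hadoop POMs, including Cloudera's, is fetched through the local Nexus instead of the internet.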

zhudejun_1985 - Great things must be done with attention to detail

Upvoted by:

I configured Maven before and already know that part. Teacher, how exactly do I set up the CDH mirror? Which recorded lesson covers it in detail? Thanks.

zhudejun_1985 - Great things must be done with attention to detail

Upvoted by:

Teacher Xian, one more question. After editing settings.xml the build does run now, but after about fifteen minutes it still failed as follows:

Downloaded: http://DX2:8081/nexus/content/ ... p.jar (461 KB at 15.7 KB/sec)
[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
[ERROR] stdout: []
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [09:36 min]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [01:23 min]
[INFO] hadoop-mapreduce-client-common ..................... FAILURE [01:11 min]
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] hadoop-mapreduce-client-nativetask ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:54 min
[INFO] Finished at: 2016-10-15T10:25:19+08:00
[INFO] Final Memory: 38M/95M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.6.0-cdh5.4.8:protoc (compile-protoc) on project hadoop-mapreduce-client-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/conflu ... ption
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-common
[root@zhumac1 hadoop-mapreduce-client]#

Teacher, after fixing the configuration, do I also have to install Forrest 0.9 and rpmdevtools before the build will succeed? For rpmdevtools I already know the command is yum install rpmdevtools. What about Forrest 0.9? I've downloaded apache-forrest-0.9-sources.tar.gz and apache-forrest-0.9-dependencies.tar.gz from the web; how exactly are these two archives installed, and what is the dependency relationship between them?

fish - Hadooper

Upvoted by:

You need to install protobuf (note: it must be version 2.5). Did you not watch the videos? See the CDH build video (or the PPT).
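Since the failure is just hadoop-maven-plugins shelling out to `protoc --version`, a quick sanity check before rebuilding can save a fifteen-minute reactor run. A minimal sketch; the helper function name is our own invention, not part of any Hadoop tooling:

```shell
# check_protoc_version: given the output of `protoc --version`
# (e.g. "libprotoc 2.5.0"), report whether it meets the 2.5.x
# requirement of the CDH 5.4.8 Hadoop build.
check_protoc_version() {
  case "$1" in
    "libprotoc 2.5."*) echo "ok" ;;
    *)                 echo "unsupported: $1 (need libprotoc 2.5.x)" ;;
  esac
}

check_protoc_version "libprotoc 2.5.0"   # prints "ok"
check_protoc_version "libprotoc 3.0.0"   # prints an "unsupported" message
```

In practice you would run `check_protoc_version "$(protoc --version)"` in the build shell before invoking mvn.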

zhudejun_1985 - Great things must be done with attention to detail

Upvoted by:

Teacher Xian, I've downloaded Xiaoxiang's protobuf from GitLab. The directory structure is currently:

/root/protobuf
[root@zhumac1 protobuf]# ls
Android.mk        generate_descriptor_proto.sh  protobuf.pc.in
autogen.sh        INSTALL.txt                   protoc-artifacts
benchmarks        java                          python
CHANGES.txt       javanano                      README
config.h.include  LICENSE                       README.md
configure.ac      m4                            ruby
conformance       Makefile.am                   src
CONTRIBUTORS.txt  more_tests                    travis.sh
csharp            objectivec                    vsprojects
editors           post_process_dist.sh
examples          protobuf-lite.pc.in
[root@zhumac1 protobuf]#

Next, do I run ./configure.ac, make, make install in that order? I only see a configure.ac script here; is that what the "configure" step refers to?

zhudejun_1985 - Great things must be done with attention to detail

Upvoted by:

Teacher, mvn eclipse:eclipse has succeeded now, thank you. One more question: with the Apache Hadoop release I used before, the directory looked like this:

[root@zhumac1 hadoop-2.6.2]# ll
total 132
drwxr-xr-x 2 10011 10011  4096 10月 12 18:28 bin
drwxr-xr-x 3 10011 10011  4096 10月 22 2015 etc
drwxr-xr-x 3 root  root   4096 9月   2 20:21 hdfs
drwxr-xr-x 2 10011 10011  4096 10月 22 2015 include
drwxr-xr-x 3 10011 10011  4096 10月 22 2015 lib
drwxr-xr-x 2 10011 10011  4096 10月 22 2015 libexec
-rw-r--r-- 1 10011 10011 15429 10月 22 2015 LICENSE.txt
drwxr-xr-x 2 root  root   4096 10月 11 21:10 logs
-rw-r--r-- 1 10011 10011   101 10月 22 2015 NOTICE.txt
-rw-r--r-- 1 10011 10011  1366 10月 22 2015 README.txt
drwxr-xr-x 2 10011 10011  4096 9月  13 16:59 sbin
drwxr-xr-x 4 10011 10011  4096 10月 22 2015 share
-rw-r--r-- 1 root  root  70397 10月 12 20:56 wordcountlog
[root@zhumac1 hadoop-2.6.2]#

and I could run the various bin/hadoop commands and so on. But right now the CDH directory only contains this:

[root@zhumac1 hadoop-cdh]# ll
total 120
-rw-rw-r--  1 root root 12096 10月 16 2015 BUILDING.txt
drwxrwxr-x  3 root root  4096 10月 16 2015 cloudera
drwxrwxr-x  2 root root  4096 10月 16 2015 dev-support
drwxrwxr-x  3 root root  4096 10月 16 2015 hadoop-assemblies
drwxrwxr-x  2 root root  4096 10月 16 2015 hadoop-client
drwxrwxr-x 10 root root  4096 10月 16 2015 hadoop-common-project
drwxrwxr-x  2 root root  4096 10月 16 2015 hadoop-dist
drwxrwxr-x  6 root root  4096 10月 16 2015 hadoop-hdfs-project
drwxrwxr-x 10 root root  4096 10月 16 2015 hadoop-mapreduce1-project
drwxrwxr-x  9 root root  4096 10月 16 2015 hadoop-mapreduce-project
drwxrwxr-x  3 root root  4096 10月 16 2015 hadoop-maven-plugins
drwxrwxr-x  2 root root  4096 10月 16 2015 hadoop-minicluster
drwxrwxr-x  3 root root  4096 10月 16 2015 hadoop-project
drwxrwxr-x  2 root root  4096 10月 16 2015 hadoop-project-dist
drwxrwxr-x 16 root root  4096 10月 16 2015 hadoop-tools
drwxrwxr-x  3 root root  4096 10月 16 2015 hadoop-yarn-project
-rw-rw-r--  1 root root 17087 10月 16 2015 LICENSE.txt
-rw-rw-r--  1 root root   101 10月 16 2015 NOTICE.txt
-rw-rw-r--  1 root root 18899 10月 16 2015 pom.xml
-rw-rw-r--  1 root root  1366 10月 16 2015 README.txt
[root@zhumac1 hadoop-cdh]#

Teacher, how do I find the CDH equivalent of the Apache command directories (bin, sbin, and so on)? Is it because the RPM package I originally downloaded doesn't contain them? Where should I look in the rpmbuild folder?

fish - Hadooper

Upvoted by:

The first listing is the directory structure of a binary package; the second is the structure of a source package. They are simply different things. Even the Apache source package looks different from that binary layout.

zhudejun_1985 - Great things must be done with attention to detail

Upvoted by:

I see. I've tried several links on Xiaoxiang's download page and none of them work; I really can't find a CDH binary package. Could you give me one, teacher? Thank you.

fish - Hadooper

Upvoted by:

The CDH binaries can be installed directly with yum install. The auto-install script you've already been given is itself based on yum.
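As a hedged illustration only (the exact package set depends on the CDH yum repository configured on the machine, and the bootcamp script may install a different selection), a manual install of the common CDH Hadoop binary packages typically looks like:

```shell
# Sketch of a manual CDH binary install via yum; package names follow
# Cloudera's CDH 5 yum repository conventions and are assumptions here.
yum install -y hadoop-hdfs-namenode hadoop-hdfs-datanode
yum install -y hadoop-yarn-resourcemanager hadoop-yarn-nodemanager
yum install -y hadoop-mapreduce hadoop-client
```

After installation the hadoop command ends up on the PATH (under /usr/bin with libraries in /usr/lib/hadoop*), rather than in a self-contained bin/ directory like the Apache tarball.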

zhudejun_1985 - 天下大事,必作于细

Upvoted by:

If I don't use that auto-install script, can I install the CDH binary packages from the command line?

fish - Hadooper

Upvoted by:

yum install will do it; read through the script and you'll see how it installs things.

zhudejun_1985 - Great things must be done with attention to detail

Upvoted by:

Teacher, which script exactly do you mean? Please give me a link or a document. I want to pinpoint the script that installs the CDH Hadoop binary package, the one with the runnable commands. Please just point me straight to it. Help!
