When installing the CDH distribution of Hadoop, the downloaded CDH Hadoop package does not include the native libraries, so compression is not supported at the various job stages. Production jobs must be able to use compression, so the CDH source has to be compiled to enable it.
Software versions used for this build:
CDH: hadoop-2.6.0-cdh5.7.0-src.tar.gz
JDK: 1.7 (several people have already hit problems with 1.8, so do not use it)
Maven: apache-maven-3.3.9-bin.tar.gz
Protobuf: protobuf-2.5.0.tar.gz
OS: CentOS 7.5
CDH official download site: http://archive.cloudera.com/cdh5/cdh/5/
Baidu Netdisk links for the required software:
repo.tar.gz | Link: https://pan.baidu.com/s/1wGCgV_3R3VUm2ka_aVA8GQ | Extraction code: lrej
Hadoop: hadoop-2.6.0-cdh5.7.0-src.tar.gz | Link: https://pan.baidu.com/s/1uRMGIhLSL9QHT-Ee4F16jw | Extraction code: jb1d
JDK: jdk-7u80-linux-x64.tar.gz | Link: https://pan.baidu.com/s/1xSCQ8rjABVI-zDFQS5nCPA | Extraction code: lfze
Maven: apache-maven-3.3.9-bin.tar.gz | Link: https://pan.baidu.com/s/1ddkdkLW7r7ahFZmgACGkVw | Extraction code: fdfz
Protobuf: protobuf-2.5.0.tar.gz | Link: https://pan.baidu.com/s/1RSNZGd_ThwknMB3vDkEfhQ | Extraction code: hvc2
First, install the build dependencies:
yum install -y svn ncurses-devel
yum install -y gcc gcc-c++ make cmake
yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake cmake
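Before starting the build it can be worth confirming that the toolchain and the compression/SSL development headers are actually present. A quick sanity check (output will vary by machine):

# confirm the compiler, build tools and development headers are installed
gcc --version
cmake --version
openssl version
rpm -q snappy-devel bzip2-devel zlib-devel openssl-devel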
Configure the JDK environment variables:
[root@hadoop004 soft]# cat /etc/profile.d/java.sh
export JAVA_HOME=/usr/java/jdk
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH
[root@hadoop004 soft]# java -version
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
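JAVA_HOME above points to /usr/java/jdk rather than a versioned directory, while the Maven output below reports /usr/java/jdk1.7.0_79, which suggests the JDK directory is symlinked. A minimal sketch of that layout, assuming the jdk-7u80 tarball from the download list (adjust the directory name to whichever JDK build is actually unpacked):

# unpack the JDK under /usr/java and point a stable symlink at it (assumed layout)
mkdir -p /usr/java
tar -xzf jdk-7u80-linux-x64.tar.gz -C /usr/java
ln -s /usr/java/jdk1.7.0_80 /usr/java/jdk
source /etc/profile.d/java.sh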
Configure the Maven environment variables:
# Configure the environment variables
[root@hadoop004 soft]# cat /etc/profile.d/maven.sh
MAVEN_HOME=/usr/local/maven
export PATH=$MAVEN_HOME/bin:$PATH
[root@hadoop004 local]# mvn --version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /usr/local/maven
Java version: 1.7.0_79, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_79/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-862.3.2.el7.x86_64", arch: "amd64", family: "unix"

# Configure the local repository location in settings.xml
<localRepository>/usr/local/maven/repo</localRepository>
Unpack the repo.tar.gz downloaded above into this local repository directory so the CDH build can use the dependencies directly; otherwise downloading them all will waste a great deal of time.

# Configure the Aliyun Maven mirror as the download source
<mirror>
    <id>alimaven</id>
    <name>aliyun maven</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
    <mirrorOf>central</mirrorOf>
</mirror>
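Both snippets above belong in Maven's settings.xml (assumed here to live at /usr/local/maven/conf/settings.xml). A minimal sketch of preparing the local repository and writing a bare-bones settings.xml; the paths and the choice to overwrite the default file are assumptions:

# unpack the pre-downloaded dependencies into the local repository
mkdir -p /usr/local/maven/repo
# adjust -C if the archive already contains a top-level repo/ directory
tar -xzf repo.tar.gz -C /usr/local/maven/repo
# write a minimal settings.xml containing only the two elements shown above
cat > /usr/local/maven/conf/settings.xml <<'EOF'
<settings>
  <localRepository>/usr/local/maven/repo</localRepository>
  <mirrors>
    <mirror>
      <id>alimaven</id>
      <name>aliyun maven</name>
      <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
</settings>
EOF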
Build and install protobuf, and configure its environment variables:
# Build and install
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protobuf
make
make install

# Configure the environment variables
[root@hadoop004 soft]# cat /etc/profile.d/protobuf.sh
PROTOC_HOME=/usr/local/protobuf
export PATH=$PROTOC_HOME/bin:$PATH

# Verify the installation
source /etc/profile.d/protobuf.sh
[root@hadoop004 soft]# protoc --version
libprotoc 2.5.0
Compile the CDH Hadoop source:
tar -xzvf hadoop-2.6.0-cdh5.7.0-src.tar.gz
cd hadoop-2.6.0-cdh5.7.0
mvn clean package -Pdist,native -DskipTests -Dtar
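The native build compiles a large amount of C/C++ code and can run for a long time, so it is common practice (not something the original mentions) to run it in the background and keep a log:

# run the build in the background and capture the output for troubleshooting
nohup mvn clean package -Pdist,native -DskipTests -Dtar > build.log 2>&1 &
tail -f build.log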
Error encountered:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/data/soft/hadoop-2.6.0-cdh5.7.0/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:126 in /data/soft/hadoop-2.6.0-cdh5.7.0/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-pipes
Cause: openssl-devel was missed when installing the dependencies. Install it with: yum install -y openssl-devel
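As the Maven output above suggests, after installing the missing package the build can also be resumed from the module that failed instead of starting over (a sketch; the steps below simply re-run the full build):

# install the missing OpenSSL headers, then resume from the failed module
yum install -y openssl-devel
mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-pipes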
Run the build command again:
tar -xzvf hadoop-2.6.0-cdh5.7.0-src.tar.gz
cd hadoop-2.6.0-cdh5.7.0
mvn clean package -Pdist,native -DskipTests -Dtar
Note: because the dependencies CDH needs were placed into the local Maven repository ahead of time, a large amount of download time is saved.
The build output:
Location: the ./hadoop-dist/target directory
Directory: hadoop-2.6.0-cdh5.7.0
Tarball: hadoop-2.6.0-cdh5.7.0.tar.gz
Check whether the various compression codecs are supported:
[root@hadoop004 hadoop-2.6.0-cdh5.7.0]# ./bin/hadoop checknative
19/04/18 15:09:34 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
19/04/18 15:09:34 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /data/soft/hadoop-2.6.0-cdh5.7.0/hadoop-dist/target/hadoop-2.6.0-cdh5.7.0/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /lib64/libsnappy.so.1
lz4:     true revision:99
bzip2:   true /lib64/libbz2.so.1
openssl: true /lib64/libcrypto.so
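If the goal is only to add native-library support to an existing Hadoop installation, one option (not covered in the original; $HADOOP_HOME and the paths are assumptions) is to copy the compiled lib/native directory over and re-run the check:

# copy the freshly built native libraries into an existing installation
cp -r ./hadoop-dist/target/hadoop-2.6.0-cdh5.7.0/lib/native/* $HADOOP_HOME/lib/native/
$HADOOP_HOME/bin/hadoop checknative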
Error and fix when moving the build to another machine:
Note: if the tarball is copied to a new machine, installed there, and hadoop checknative reports an error such as:
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
run yum install openssl-devel -y on that CentOS machine.
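After installing the package, re-running the check should show openssl as true again (a quick verification sketch):

yum install -y openssl-devel
./bin/hadoop checknative | grep openssl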