Some versions of Kylin have been taken down officially and are no longer on Docker Hub, so I had to build an image myself. To make future use quicker, I wrote a lightweight Dockerfile.
Preparation
The packages used in this build come from the Huawei Cloud mirror site, which hosts binary packages for every Kylin version:
https://repo.huaweicloud.com:8443/artifactory/apache-local/kylin/
Taking version 3.1.1 as an example: the Kylin website has an installation guide (Installation Guide | Apache Kylin); prepare Kylin's dependencies according to its requirements:
Check the available Hive versions. The Apache Hive archive page lists every released version:
https://archive.apache.org/dist/hive/
Likewise, find the download links for HBase and Hadoop:
https://archive.apache.org/dist/hbase/
https://archive.apache.org/dist/hadoop/common/
Note that the versions must be aligned exactly as the official guide specifies.
Since the image needed repeated debugging, here are the basic commands I used:
# Build the image
docker build -t kylin:3.1.1 .
# Run
sudo docker run -d --name kylin -p 7070:7070 kylin:3.1.1
# Enter the container
docker exec -it kylin /bin/bash
# View logs
docker logs kylin
# Stop the container
sudo docker stop kylin
# Remove the container
sudo docker rm kylin
# List images
docker images
# Remove an image
docker rmi ID
# Remove all images
docker rmi $(docker images -q) --force
# Clear the build cache
docker system prune -a
Troubleshooting
Problem: wrong JAVA path
Solution:
# Use openjdk:8-jdk-alpine as the base image
FROM openjdk:8-jdk-alpine
# Set the working directory
WORKDIR /root
# Run sh by default (alpine has no bash)
CMD ["/bin/sh"]
Build and enter the container:
docker build -t openjdk-alpine .
sudo docker run -it openjdk-alpine
Find the real JAVA path inside it, then fix the Dockerfile accordingly.
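A quick way to find the real path is to follow the symlink chain behind the `java` binary; this sketch assumes `command -v` and `readlink -f` are available in the image (busybox on alpine provides both):

```shell
# Resolve the real location of the java binary, following symlinks.
JAVA_BIN=$(command -v java || true)
if [ -n "$JAVA_BIN" ]; then
    readlink -f "$JAVA_BIN"    # the directory above bin/ is your JAVA_HOME
else
    echo "java not found on PATH"
fi
```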
Problem: Failed to create hdfs:///kylin/spark-history
Solution: edit conf/kylin.properties under KYLIN_HOME:
# Use a local filesystem path instead of HDFS
kylin.engine.spark-conf.spark.eventLog.dir=file:///tmp/kylin/spark-history
kylin.engine.spark-conf.spark.history.fs.logDirectory=file:///tmp/kylin/spark-history
Repack the archive:
tar -czvf apache-kylin-3.1.1-bin-hbase1x.tar.gz apache-kylin-3.1.1-bin-hbase1x
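Before baking the repacked tarball into the image, it is worth listing its contents to confirm the edited kylin.properties actually made it in. A tiny self-contained demo of that check (the directory and file names here are illustrative, not the real package):

```shell
# Build a throwaway archive with the same layout and list its contents.
mkdir -p demo-kylin/conf
echo 'kylin.engine.spark-conf.spark.eventLog.dir=file:///tmp/kylin/spark-history' \
    > demo-kylin/conf/kylin.properties
tar -czf demo-kylin.tar.gz demo-kylin
tar -tzf demo-kylin.tar.gz    # should list demo-kylin/conf/kylin.properties
```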
Don't forget to create the corresponding directory:
mkdir -p /tmp/kylin/spark-history && \
chmod -R 777 /tmp/kylin/spark-history
Problem: the run fails with Will not attempt to authenticate using SASL (unknown error)
2024-11-08 10:35:03,089 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:975 : Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error)
2024-11-08 10:35:03,091 WARN [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:1102 : Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
    at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081)
Solution: HBase had not started successfully; its startup must go into the container's CMD.
Set the command the container runs on start: bring up HBase first, then Hadoop, then Kylin, and finally hang the container with an endless wait.
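That ordering can be captured in a small entrypoint script. The sketch below only writes the script to a file (the /opt paths are assumptions matching the layout used later in this post); HBase must come first because Kylin keeps its metadata in HBase:

```shell
# Write out the start sequence rather than running it here.
cat > start-kylin.sh <<'EOF'
#!/bin/sh
/opt/hbase/bin/start-hbase.sh      # 1. HBase (with its embedded ZooKeeper)
/opt/hadoop/sbin/start-all.sh      # 2. HDFS + YARN
/opt/kylin/bin/kylin.sh start      # 3. Kylin itself, once its dependencies are up
tail -f /dev/null                  # 4. keep PID 1 alive so the container stays up
EOF
chmod +x start-kylin.sh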
Problem: Hive complains that hive-site.xml does not exist in /opt/hive/conf/
Solution: copy the default template into the same directory.
# Fix the missing-file error
cd /opt/hive/conf/
cp hive-default.xml.template hive-site.xml
Problem: ERROR: Unknown error. Please check full log.
2024-11-08 10:41:12,357 INFO [close-hbase-conn] hbase.HBaseConnection:137 : Closing HBase connections...
2024-11-08 10:41:12,357 INFO [close-hbase-conn] client.ConnectionManager$HConnectionImplementation:1676 : Closing zookeeper sessionid=0x0
2024-11-08 10:41:12,457 INFO [close-hbase-conn] zookeeper.ZooKeeper:684 : Session: 0x0 closed
2024-11-08 10:41:12,457 INFO [main-EventThread] zookeeper.ClientCnxn:512 : EventThread shut down
ERROR: Unknown error. Please check full log.
Solution: replace the placeholders in hive-site.xml with local paths, otherwise Hive still fails to start.
vi hive-site.xml
# Entries to change:
#hive.querylog.location ==> /opt/hive/iotmp
#hive.exec.local.scratchdir ==> /opt/hive/iotmp
#hive.downloaded.resources.dir ==> /opt/hive/iotmp
If that feels tedious, the whole edit can be done in one shot with awk:
# Strip the placeholders
awk '{
gsub(/\${system:java.io.tmpdir}\/\${system:user.name}/, "/opt/hive/iotmp");
gsub(/\${system:java.io.tmpdir}\/\${hive.session.id}_resources/, "/opt/hive/iotmp");
print
}' hive-site.xml > hive-site.xml.new && mv hive-site.xml.new hive-site.xml
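For reference, sed can do the same substitution. The demo below runs on a one-line sample file so the pattern can be checked safely; the real target is /opt/hive/conf/hive-site.xml, and GNU sed is assumed:

```shell
# Create a sample line containing the Hive placeholder, then substitute it.
printf '%s\n' '<value>${system:java.io.tmpdir}/${system:user.name}</value>' > sample.xml
sed -e 's#${system:java.io.tmpdir}/${system:user.name}#/opt/hive/iotmp#g' \
    -e 's#${system:java.io.tmpdir}/${hive.session.id}_resources#/opt/hive/iotmp#g' \
    sample.xml
# prints: <value>/opt/hive/iotmp</value>
```

Using `#` as the s-command delimiter avoids having to escape the `/` characters in the replacement path.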
Finally, don't forget to create the temp directory:
cd ..
mkdir iotmp
Problem: ZooKeeper JMX enabled by default
Using config: /usr/local/zookeeper/bin/../conf/zoo.cfg
Starting zookeeper ... FAILED TO START
Solution: version mismatch; double-check that the versions are aligned.
Dockerfile
If it still doesn't work, you can use my Dockerfile directly:
# Use Ubuntu 16.04 as the base image
FROM ubuntu:16.04
# Update package lists and install required tools
RUN apt-get update && apt-get install -y \
wget \
curl \
tar \
vim \
procps \
findutils \
openjdk-8-jdk \
&& apt-get clean
# Set environment variables
ENV KYLIN_HOME /opt/kylin
ENV HADOOP_HOME /opt/hadoop
ENV HIVE_HOME /opt/hive
ENV HBASE_HOME /opt/hbase
ENV ZOOKEEPER_HOME /opt/zookeeper
ENV JAVA_HOME /usr/lib/jvm/java-8-openjdk-amd64
ENV HADOOP_CONF_DIR $HADOOP_HOME/etc/hadoop
ENV HADOOP_COMMON_LIB_NATIVE_DIR $HADOOP_HOME/lib/native
ENV HADOOP_OPTS "-Duser.country=US -Duser.language=en"
ENV PATH $KYLIN_HOME/bin:$HADOOP_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$ZOOKEEPER_HOME/bin:$PATH
# Use the root user
USER root
# Download and install Hadoop
RUN wget https://archive.apache.org/dist/hadoop/common/hadoop-2.7.0/hadoop-2.7.0.tar.gz -P /opt/ && \
tar -zxvf /opt/hadoop-2.7.0.tar.gz -C /opt/ && \
mv /opt/hadoop-2.7.0 /opt/hadoop
# Download and install Hive
RUN wget https://archive.apache.org/dist/hive/hive-1.2.0/apache-hive-1.2.0-bin.tar.gz -P /opt/ && \
tar -zxvf /opt/apache-hive-1.2.0-bin.tar.gz -C /opt/ && \
mv /opt/apache-hive-1.2.0-bin /opt/hive
# Download and extract HBase
RUN wget https://archive.apache.org/dist/hbase/1.1.2/hbase-1.1.2-bin.tar.gz -P /opt/ && \
tar -zxvf /opt/hbase-1.1.2-bin.tar.gz -C /opt/ && \
mv /opt/hbase-1.1.2 /opt/hbase
# Download and extract the Kylin package
RUN wget https://repo.huaweicloud.com:8443/artifactory/apache-local/kylin/apache-kylin-3.1.1/apache-kylin-3.1.1-bin-hbase1x.tar.gz -P /opt/ && \
mkdir -p $KYLIN_HOME && \
tar -zxvf /opt/apache-kylin-3.1.1-bin-hbase1x.tar.gz -C $KYLIN_HOME --strip-components=1
# Create the Spark temp directory and open up its permissions
RUN mkdir -p /tmp/kylin/spark-history && \
chmod -R 777 /tmp/kylin/spark-history
# Create the Kylin working directory and set permissions
RUN mkdir -p /kylin && chmod -R 777 /kylin
# If you have custom config files, copy them into the container
# COPY conf/ $KYLIN_HOME/conf/
# Patch Kylin's config; otherwise it fails with: Failed to create hdfs:///kylin/spark-history
RUN echo "kylin.engine.spark-conf.spark.eventLog.dir=file:///tmp/kylin/spark-history" >> $KYLIN_HOME/conf/kylin.properties && \
echo "kylin.engine.spark-conf.spark.history.fs.logDirectory=file:///tmp/kylin/spark-history" >> $KYLIN_HOME/conf/kylin.properties
# Copy the default template; otherwise Hive complains hive-site.xml is missing from /opt/hive/conf/
RUN cp /opt/hive/conf/hive-default.xml.template /opt/hive/conf/hive-site.xml
# Replace the placeholders in hive-site.xml with local paths; otherwise Hive still won't start
RUN awk '{gsub(/\${system:java.io.tmpdir}\/\${system:user.name}/, "/opt/hive/iotmp"); gsub(/\${system:java.io.tmpdir}\/\${hive.session.id}_resources/, "/opt/hive/iotmp"); print}' /opt/hive/conf/hive-site.xml > /opt/hive/conf/hive-site.xml.new && \
mv /opt/hive/conf/hive-site.xml.new /opt/hive/conf/hive-site.xml && \
mkdir -p /opt/hive/iotmp
# Container start command: HBase first, then Hadoop, then Kylin, finally an endless wait to keep the container alive
CMD ["sh", "-c", "/opt/hbase/bin/start-hbase.sh && /opt/hadoop/sbin/start-all.sh && $KYLIN_HOME/bin/kylin.sh start && tail -f /dev/null"]
#CMD ["sh", "-c", "sleep 65535"]
Docker Logs
For reference, the complete logs of a successful run:
$ docker logs kylin
starting master, logging to /opt/hbase/logs/hbase--master-a4709eb6289e.out
OpenJDK 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
24/11/08 11:03:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
localhost: /opt/hadoop/sbin/slaves.sh: line 60: ssh: command not found
localhost: /opt/hadoop/sbin/slaves.sh: line 60: ssh: command not found
Starting secondary namenodes [0.0.0.0]
0.0.0.0: /opt/hadoop/sbin/slaves.sh: line 60: ssh: command not found
24/11/08 11:03:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
chown: missing operand after '/opt/hadoop/logs'
Try 'chown --help' for more information.
starting resourcemanager, logging to /opt/hadoop/logs/yarn--resourcemanager-a4709eb6289e.out
localhost: /opt/hadoop/sbin/slaves.sh: line 60: ssh: command not found
Retrieving hadoop conf dir...
...................................................[PASS]
KYLIN_HOME is set to /opt/kylin
Checking HBase
...................................................[PASS]
Checking hive
...................................................[PASS]
Checking hadoop shell
...................................................[PASS]
Checking hdfs working dir
24/11/08 11:03:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
...................................................[PASS]
Retrieving Spark dependency...
24/11/08 11:03:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
24/11/08 11:03:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Optional dependency spark not found, if you need this; set SPARK_HOME, or run bin/download-spark.sh
...................................................[PASS]
Retrieving Flink dependency...
Optional dependency flink not found, if you need this; set FLINK_HOME, or run bin/download-flink.sh
...................................................[PASS]
Retrieving kafka dependency...
Couldn't find kafka home. If you want to enable streaming processing, Please set KAFKA_HOME to the path which contains kafka dependencies.
...................................................[PASS]
/opt/kylin/bin/check-port-availability.sh: line 30: netstat: command not found
Checking environment finished successfully. To check again, run 'bin/check-env.sh' manually.
Retrieving hive dependency...
Logging initialized using configuration in jar:file:/opt/hive/lib/hive-common-1.2.0.jar!/hive-log4j.properties
export hiveWarehouseDir=/user/hive/warehouse
Retrieving hbase dependency...
Retrieving hadoop conf dir...
Retrieving kafka dependency...
Couldn't find kafka home. If you want to enable streaming processing, Please set KAFKA_HOME to the path which contains kafka dependencies.
Retrieving Spark dependency...
24/11/08 11:04:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
24/11/08 11:04:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Optional dependency spark not found, if you need this; set SPARK_HOME, or run bin/download-spark.sh
Retrieving Flink dependency...
Optional dependency flink not found, if you need this; set FLINK_HOME, or run bin/download-flink.sh
Start to check whether we need to migrate acl tables
Using cached dependency...
skip spark_dependency
skip flink_dependency
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/kylin/tool/kylin-tool-3.1.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2024-11-08 11:04:15,227 INFO [main] common.KylinConfig:118 : Loading kylin-defaults.properties from file:/opt/kylin/tool/kylin-tool-3.1.1.jar!/kylin-defaults.properties
2024-11-08 11:04:15,251 INFO [main] common.KylinConfig:352 : Use KYLIN_HOME=/opt/kylin
2024-11-08 11:04:15,254 INFO [main] common.KylinConfig:153 : Initialized a new KylinConfig from getInstanceFromEnv : 280884709
2024-11-08 11:04:15,318 INFO [main] persistence.ResourceStore:90 : Using metadata url kylin_metadata@hbase for resource store
2024-11-08 11:04:15,786 WARN [main] util.NativeCodeLoader:62 : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2024-11-08 11:04:16,029 INFO [main] hbase.HBaseConnection:267 : connection is null or closed, creating a new one
2024-11-08 11:04:16,124 INFO [main] zookeeper.RecoverableZooKeeper:120 : Process identifier=hconnection-0x4d5650ae connecting to ZooKeeper ensemble=localhost:2181
2024-11-08 11:04:16,130 INFO [main] zookeeper.ZooKeeper:100 : Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2024-11-08 11:04:16,130 INFO [main] zookeeper.ZooKeeper:100 : Client environment:host.name=a4709eb6289e
2024-11-08 11:04:16,130 INFO [main] zookeeper.ZooKeeper:100 : Client environment:java.version=1.8.0_292
2024-11-08 11:04:16,131 INFO [main] zookeeper.ZooKeeper:100 : Client environment:java.vendor=Private Build
2024-11-08 11:04:16,131 INFO [main] zookeeper.ZooKeeper:100 : Client environment:java.home=/usr/lib/jvm/java-8-openjdk-amd64/jre
2024-11-08 11:04:16,131 INFO [main] zookeeper.ZooKeeper:100 : Client environment:java.class.path=/opt/kylin/tool/kylin-tool-3.1.1.jar:/opt/kylin/conf:/opt/kylin/lib/kylin-jdbc-3.1.1.jar:/opt/kylin/lib/kylin-coprocessor-3.1.1.jar:... (the rest of the multi-thousand-character classpath entry, spanning the Kylin, Tomcat, HBase, Hadoop, and Hive lib directories, is omitted here)
en-scm-provider-svn-commons-1.4.jar:/opt/hive/lib/activation-1.1.jar:/opt/hive/lib/libfb303-0.9.2.jar:/opt/hive/lib/velocity-1.5.jar:/opt/hive/lib/commons-lang-2.6.jar:/opt/hive/lib/commons-dbcp-1.4.jar:/opt/hive/lib/antlr-2.7.7.jar:/opt/hive/lib/accumulo-core-1.6.0.jar:/opt/hive/lib/parquet-hadoop-bundle-1.6.0.jar:/opt/hive/lib/json-20090211.jar:/opt/hive/lib/datanucleus-rdbms-3.2.9.jar:/opt/hive/lib/commons-compress-1.4.1.jar:/opt/hive/lib/httpcore-4.4.jar:/opt/hive/lib/jetty-all-7.6.0.v20120127.jar:/opt/hive/lib/paranamer-2.3.jar:/opt/hive/lib/commons-httpclient-3.0.1.jar:/opt/hive/lib/jta-1.1.jar:/opt/hive/lib/hive-exec-1.2.0.jar:/opt/hive/lib/asm-tree-3.1.jar:/opt/hive/lib/maven-scm-provider-svnexe-1.4.jar:/opt/hive/lib/hive-service-1.2.0.jar:/opt/hive/lib/commons-codec-1.4.jar:/opt/hive/lib/apache-log4j-extras-1.2.17.jar:/opt/hive/lib/guava-14.0.1.jar:/opt/hive/lib/hive-hbase-handler-1.2.0.jar:/opt/hive/lib/eigenbase-properties-1.1.5.jar:/opt/hive/lib/asm-commons-3.1.jar:/opt/hive/lib/ivy-2.4.0.jar:/opt/hive/lib/commons-beanutils-1.7.0.jar:/opt/hive/lib/commons-vfs2-2.0.jar:/opt/hive/lib/commons-cli-1.2.jar:/opt/hive/lib/servlet-api-2.5.jar:/opt/hive/lib/hive-shims-1.2.0.jar:/opt/hive/lib/stringtemplate-3.2.1.jar:/opt/hive/lib/hive-shims-common-1.2.0.jar:/opt/hive/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/opt/hive/lib/xz-1.0.jar:/opt/hive/lib/hive-serde-1.2.0.jar:/opt/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/opt/hive/lib/hive-testutils-1.2.0.jar:/opt/hive/lib/jdo-api-3.0.1.jar:/opt/hive/lib/hive-cli-1.2.0.jar:/opt/hive/lib/commons-configuration-1.6.jar:/opt/hive/lib/accumulo-trace-1.6.0.jar:/opt/hive/lib/curator-framework-2.6.0.jar:/opt/hive/lib/joda-time-2.5.jar:/opt/hive/lib/snappy-java-1.0.5.jar:/opt/hive/lib/httpclient-4.4.jar:/opt/hive/lib/datanucleus-api-jdo-3.2.6.jar:/opt/hive/lib/jpam-1.1.jar:/opt/hive/lib/netty-3.7.0.Final.jar:/opt/hive/lib/commons-beanutils-core-1.8.0.jar:/opt/hive/lib/commons-logging-1.1.3.jar:/opt/hive/lib/junit-4.
11.jar:/opt/hive/lib/mail-1.4.1.jar:/opt/hive/lib/oro-2.0.8.jar:/opt/hive/lib/jsr305-3.0.0.jar:/opt/hive/lib/log4j-1.2.16.jar:/opt/hive/lib/hive-metastore-1.2.0.jar:/opt/hive/lib/hive-shims-0.23-1.2.0.jar:/opt/hive/lib/ant-launcher-1.9.1.jar:/opt/hive/lib/regexp-1.3.jar:/opt/hive/lib/commons-collections-3.2.1.jar:/opt/hive/lib/hive-beeline-1.2.0.jar:/opt/hive/lib/hive-jdbc-1.2.0-standalone.jar:/opt/hive/lib/tempus-fugit-1.1.jar:/opt/hive/lib/jline-2.12.jar:/opt/hive/lib/commons-math-2.1.jar:/opt/hive/lib/ST4-4.0.4.jar:/opt/hive/lib/accumulo-fate-1.6.0.jar:/opt/hive/lib/curator-recipes-2.6.0.jar:/opt/hive/hcatalog/share/hcatalog/hive-hcatalog-core-1.2.0.jar:::
2024-11-08 11:04:16,131 INFO [main] zookeeper.ZooKeeper:100 : Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
2024-11-08 11:04:16,132 INFO [main] zookeeper.ZooKeeper:100 : Client environment:java.io.tmpdir=/tmp
2024-11-08 11:04:16,132 INFO [main] zookeeper.ZooKeeper:100 : Client environment:java.compiler=<NA>
2024-11-08 11:04:16,133 INFO [main] zookeeper.ZooKeeper:100 : Client environment:os.name=Linux
2024-11-08 11:04:16,133 INFO [main] zookeeper.ZooKeeper:100 : Client environment:os.arch=amd64
2024-11-08 11:04:16,134 INFO [main] zookeeper.ZooKeeper:100 : Client environment:os.version=5.4.0-54-generic
2024-11-08 11:04:16,134 INFO [main] zookeeper.ZooKeeper:100 : Client environment:user.name=root
2024-11-08 11:04:16,134 INFO [main] zookeeper.ZooKeeper:100 : Client environment:user.home=/root
2024-11-08 11:04:16,134 INFO [main] zookeeper.ZooKeeper:100 : Client environment:user.dir=/
2024-11-08 11:04:16,135 INFO [main] zookeeper.ZooKeeper:438 : Initiating client connection, connectString=localhost:2181 sessionTimeout=90000 watcher=hconnection-0x4d5650ae0x0, quorum=localhost:2181, baseZNode=/hbase
2024-11-08 11:04:16,153 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:975 : Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
2024-11-08 11:04:16,158 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:852 : Socket connection established to localhost/127.0.0.1:2181, initiating session
2024-11-08 11:04:16,165 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:1235 : Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x1930b7196420007, negotiated timeout = 40000
2024-11-08 11:04:16,646 INFO [main] common.KylinConfigBase:238 : Kylin Config was updated with kylin.server.cluster-name : kylin_metadata
2024-11-08 11:04:16,744 INFO [main] util.ZKUtil:165 : zookeeper connection string: localhost:2181 with namespace /kylin/kylin_metadata
2024-11-08 11:04:16,804 INFO [main] imps.CuratorFrameworkImpl:235 : Starting
2024-11-08 11:04:16,806 INFO [main] zookeeper.ZooKeeper:438 : Initiating client connection, connectString=localhost:2181/kylin/kylin_metadata sessionTimeout=120000 watcher=org.apache.curator.ConnectionState@7f0d96f2
2024-11-08 11:04:16,808 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:975 : Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
2024-11-08 11:04:16,809 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:852 : Socket connection established to localhost/127.0.0.1:2181, initiating session
2024-11-08 11:04:16,812 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:1235 : Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x1930b7196420008, negotiated timeout = 40000
2024-11-08 11:04:16,817 INFO [main] util.ZKUtil:169 : new zookeeper Client start: localhost:2181
2024-11-08 11:04:16,820 INFO [main-EventThread] state.ConnectionStateManager:228 : State change: CONNECTED
2024-11-08 11:04:16,828 INFO [main] imps.CuratorFrameworkImpl:235 : Starting
2024-11-08 11:04:16,829 INFO [main] zookeeper.ZooKeeper:438 : Initiating client connection, connectString=localhost:2181 sessionTimeout=120000 watcher=org.apache.curator.ConnectionState@3fc9504b
2024-11-08 11:04:16,830 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:975 : Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
2024-11-08 11:04:16,846 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:852 : Socket connection established to localhost/127.0.0.1:2181, initiating session
2024-11-08 11:04:16,850 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:1235 : Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x1930b7196420009, negotiated timeout = 40000
2024-11-08 11:04:16,850 INFO [main-EventThread] state.ConnectionStateManager:228 : State change: CONNECTED
2024-11-08 11:04:16,864 INFO [Curator-Framework-0] imps.CuratorFrameworkImpl:821 : backgroundOperationsLoop exiting
2024-11-08 11:04:16,869 INFO [main] zookeeper.ZooKeeper:684 : Session: 0x1930b7196420009 closed
2024-11-08 11:04:16,870 INFO [main-EventThread] zookeeper.ClientCnxn:512 : EventThread shut down
2024-11-08 11:04:16,883 INFO [main] zookeeper.ZookeeperDistributedLock:114 : 7294@a4709eb6289e acquired lock at /create_htable/kylin_metadata/lock
2024-11-08 11:04:18,300 INFO [main] client.HBaseAdmin:669 : Created kylin_metadata
2024-11-08 11:04:18,305 INFO [main] zookeeper.ZookeeperDistributedLock:289 : 7294@a4709eb6289e released lock at /create_htable/kylin_metadata/lock
2024-11-08 11:04:18,436 INFO [main] hbase.HBaseConnection:267 : connection is null or closed, creating a new one
2024-11-08 11:04:18,437 INFO [main] zookeeper.RecoverableZooKeeper:120 : Process identifier=hconnection-0x3e10dc6 connecting to ZooKeeper ensemble=localhost:2181
2024-11-08 11:04:18,437 INFO [main] zookeeper.ZooKeeper:438 : Initiating client connection, connectString=localhost:2181 sessionTimeout=90000 watcher=hconnection-0x3e10dc60x0, quorum=localhost:2181, baseZNode=/hbase
2024-11-08 11:04:18,442 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:975 : Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
2024-11-08 11:04:18,443 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:852 : Socket connection established to localhost/127.0.0.1:2181, initiating session
2024-11-08 11:04:18,445 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:1235 : Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x1930b719642000a, negotiated timeout = 40000
2024-11-08 11:04:18,461 INFO [Thread-5] util.ZKUtil:93 : Going to remove 1 cached curator clients
2024-11-08 11:04:18,463 INFO [Thread-5] util.ZKUtil:78 : CuratorFramework for zkString localhost:2181 is removed due to EXPLICIT
2024-11-08 11:04:18,463 INFO [Curator-Framework-0] imps.CuratorFrameworkImpl:821 : backgroundOperationsLoop exiting
2024-11-08 11:04:18,464 INFO [close-hbase-conn] hbase.HBaseConnection:137 : Closing HBase connections...
2024-11-08 11:04:18,465 INFO [close-hbase-conn] client.ConnectionManager$HConnectionImplementation:1676 : Closing zookeeper sessionid=0x1930b719642000a
2024-11-08 11:04:18,466 INFO [Thread-5] zookeeper.ZooKeeper:684 : Session: 0x1930b7196420008 closed
2024-11-08 11:04:18,466 INFO [main-EventThread] zookeeper.ClientCnxn:512 : EventThread shut down
2024-11-08 11:04:18,467 INFO [main-EventThread] zookeeper.ClientCnxn:512 : EventThread shut down
2024-11-08 11:04:18,468 INFO [close-hbase-conn] zookeeper.ZooKeeper:684 : Session: 0x1930b719642000a closed
2024-11-08 11:04:18,568 INFO [close-hbase-conn] client.ConnectionManager$HConnectionImplementation:2068 : Closing master protocol: MasterService
2024-11-08 11:04:18,571 INFO [close-hbase-conn] client.ConnectionManager$HConnectionImplementation:1676 : Closing zookeeper sessionid=0x1930b7196420007
2024-11-08 11:04:18,573 INFO [close-hbase-conn] zookeeper.ZooKeeper:684 : Session: 0x1930b7196420007 closed
2024-11-08 11:04:18,573 INFO [main-EventThread] zookeeper.ClientCnxn:512 : EventThread shut down
A new Kylin instance is started by . To stop it, run 'kylin.sh stop'
Check the log at /opt/kylin/logs/kylin.log
Web UI is at http://a4709eb6289e:7070/kylin
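Once the log ends with the banner above, the quickest sanity check is to confirm the startup log contains both an established ZooKeeper session and Kylin's final banner, rather than the earlier "Connection refused" errors. A minimal sketch: the sample file below is a stand-in for `/opt/kylin/logs/kylin.log` (in practice you would read it via `docker exec kylin cat /opt/kylin/logs/kylin.log`):

```shell
# Stand-in excerpt for /opt/kylin/logs/kylin.log
cat > /tmp/kylin_sample.log <<'EOF'
2024-11-08 11:04:16,165 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:1235 : Session establishment complete on server localhost/127.0.0.1:2181
A new Kylin instance is started by . To stop it, run 'kylin.sh stop'
EOF

# A healthy startup shows both lines; "Connection refused" instead means
# ZooKeeper/HBase was not up yet when Kylin tried to connect
if grep -q "Session establishment complete" /tmp/kylin_sample.log && \
   grep -q "A new Kylin instance is started" /tmp/kylin_sample.log; then
  echo "startup OK"
else
  echo "startup FAILED"
fi
```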
Reusing the Template
Version 3.1.1 above was just an example; other versions follow the same pattern. For instance, here is a debugged, working Dockerfile for version 2.3.1:
# Use Ubuntu 16.04 as the base image
FROM ubuntu:16.04
# Update package lists and install the required tools
RUN apt-get update && apt-get install -y \
wget \
curl \
tar \
vim \
procps \
findutils \
openjdk-8-jdk \
&& apt-get clean
# Set environment variables
ENV KYLIN_HOME=/opt/kylin
ENV HADOOP_HOME=/opt/hadoop
ENV HIVE_HOME=/opt/hive
ENV HBASE_HOME=/opt/hbase
ENV ZOOKEEPER_HOME=/opt/zookeeper
ENV JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
ENV HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
ENV HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
ENV HADOOP_OPTS="-Duser.country=US -Duser.language=en"
ENV PATH=$KYLIN_HOME/bin:$HADOOP_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$ZOOKEEPER_HOME/bin:$PATH
# Run as the root user
USER root
# Download and install Hadoop
RUN wget https://archive.apache.org/dist/hadoop/common/hadoop-2.7.0/hadoop-2.7.0.tar.gz -P /opt/ && \
tar -zxvf /opt/hadoop-2.7.0.tar.gz -C /opt/ && \
mv /opt/hadoop-2.7.0 /opt/hadoop
# Download and install Hive
RUN wget https://archive.apache.org/dist/hive/hive-1.2.0/apache-hive-1.2.0-bin.tar.gz -P /opt/ && \
tar -zxvf /opt/apache-hive-1.2.0-bin.tar.gz -C /opt/ && \
mv /opt/apache-hive-1.2.0-bin /opt/hive
# Download and unpack HBase
RUN wget https://archive.apache.org/dist/hbase/1.1.2/hbase-1.1.2-bin.tar.gz -P /opt/ && \
tar -zxvf /opt/hbase-1.1.2-bin.tar.gz -C /opt/ && \
mv /opt/hbase-1.1.2 /opt/hbase
# Download and unpack the Kylin binary package
RUN wget https://repo.huaweicloud.com:8443/artifactory/apache-local/kylin/apache-kylin-2.3.1/apache-kylin-2.3.1-hbase1x-bin.tar.gz -P /opt/ && \
mkdir -p $KYLIN_HOME && \
tar -zxvf /opt/apache-kylin-2.3.1-hbase1x-bin.tar.gz -C $KYLIN_HOME --strip-components=1
# Create the Spark history directory and open up its permissions
RUN mkdir -p /tmp/kylin/spark-history && \
chmod -R 777 /tmp/kylin/spark-history
# Create the Kylin working directory and set its permissions
RUN mkdir -p /kylin && chmod -R 777 /kylin
# If you have custom configuration files, copy them into the container here
# COPY conf/ $KYLIN_HOME/conf/
# Patch Kylin's configuration; otherwise startup fails with "Failed to create hdfs:///kylin/spark-history"
RUN echo "kylin.engine.spark-conf.spark.eventLog.dir=file:///tmp/kylin/spark-history" >> $KYLIN_HOME/conf/kylin.properties && \
echo "kylin.engine.spark-conf.spark.history.fs.logDirectory=file:///tmp/kylin/spark-history" >> $KYLIN_HOME/conf/kylin.properties
# Copy the default template into place; otherwise Hive complains that hive-site.xml is missing from /opt/hive/conf/
RUN cp /opt/hive/conf/hive-default.xml.template /opt/hive/conf/hive-site.xml
# Replace the ${system:...} placeholders in hive-site.xml with a local path; otherwise Hive still fails to start
RUN awk '{gsub(/\${system:java.io.tmpdir}\/\${system:user.name}/, "/opt/hive/iotmp"); gsub(/\${system:java.io.tmpdir}\/\${hive.session.id}_resources/, "/opt/hive/iotmp"); print}' /opt/hive/conf/hive-site.xml > /opt/hive/conf/hive-site.xml.new && \
mv /opt/hive/conf/hive-site.xml.new /opt/hive/conf/hive-site.xml && \
mkdir -p /opt/hive/iotmp
# Container startup command: start HBase, then Hadoop, then Kylin, and finally keep the container alive with an endless tail
CMD ["sh", "-c", "/opt/hbase/bin/start-hbase.sh && /opt/hadoop/sbin/start-all.sh && $KYLIN_HOME/bin/kylin.sh start && tail -f /dev/null"]
#CMD ["sh", "-c", "sleep 65535"]
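If you would rather not retype the `docker run` flags from the debugging commands each time, the image can also be wired up via Docker Compose. This is a sketch, not part of the original setup; the service name and the `kylin:2.3.1` tag are assumptions matching the build above:

```yaml
# docker-compose.yml — sketch for the image built from the Dockerfile above
services:
  kylin:
    image: kylin:2.3.1        # assumed tag from `docker build -t kylin:2.3.1 .`
    container_name: kylin
    ports:
      - "7070:7070"           # Kylin Web UI
    restart: unless-stopped
```

After `docker compose up -d`, the Web UI should be reachable at http://localhost:7070/kylin (Kylin's default credentials are ADMIN/KYLIN).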