Hive on Spark in Practice

Configuration

I set up Hadoop 2.6 with Cloudera's CDH suite, but the Hive version that ships with CDH is rather old, so I installed a standalone Hive 2.3. Since Hive 2 deprecates the old MapReduce engine, I picked Spark as the execution engine and followed the Hive on Spark guide on the official Hive site.
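The engine can be switched per session with set hive.execution.engine=spark; (shown later); to make it the default for every session, the property can also go into hive-site.xml. A minimal sketch:

<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>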

Standalone

After setting up Spark in standalone mode, I ran into some trouble configuring Hive against it.

Because I was using a pre-built Spark distribution, the following error came up:

java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
    at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)

The next step was to build Spark without Hive myself: pre-built Spark distributions bundle their own, older Hive classes, which conflict with Hive 2.3 and are the usual culprit behind the NoSuchFieldError above, and Hive on Spark requires a Spark build that does not include the Hive jars. The build, however, kept failing with:

[info] 'compiler-interface' not yet compiled for Scala 2.11.8. Compiling...
error: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
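For reference, the build command suggested by the Hive on Spark guide looks roughly like this (the Hadoop profile version here is my assumption; match it to your cluster):

./dev/make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided"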

It was all rather tortuous, so in the end I fell back to the Spark on YARN deployment that was already configured.

YARN

With the Spark on YARN cluster already up, Spark is configured from within Hive:

set hive.execution.engine=spark;
set spark.master=yarn;
set spark.executor.memory=512m;
set spark.serializer=org.apache.spark.serializer.KryoSerializer;
set spark.eventLog.enabled=true;
set spark.eventLog.dir=<Spark event log folder (must exist)>;

Next, upload the Spark dependency jars needed by Spark applications to HDFS, for example into a /spark-jars directory, and point hive-site.xml at it:

<property>
  <name>spark.yarn.jars</name>
  <value>hdfs://xxxx:8020/spark-jars/*</value>
</property>

Note that the user running Hive must have permission to access this directory on HDFS.
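A sketch of the upload and permission steps, assuming a stock Spark 2.x layout under $SPARK_HOME (adjust paths to your install):

hadoop fs -mkdir -p /spark-jars
hadoop fs -put $SPARK_HOME/jars/* /spark-jars/
hadoop fs -chmod -R 755 /spark-jars

After this, any query that launches a job, even a simple SELECT COUNT(*), should show up as a Spark application in the YARN ResourceManager UI.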

Importing user data

Source data

Suppose the source data is JSON stored under /tmp on HDFS, with records like:

{
  "time": "1515682813526",
  "uid": 1,
  "ip": "1.1.1.1",
  "path": "https://github.com",
  "referer": "https://google.com"
}
(In the actual files, each JSON record has to sit on a single line for the SerDe to parse it; it is pretty-printed above only for readability.)

Create the Hive external table
CREATE EXTERNAL TABLE IF NOT EXISTS tmp.pageview (
  time BIGINT,
  uid BIGINT,
  ip STRING,
  path STRING,
  referer STRING
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE
LOCATION '/tmp';
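One caveat: org.apache.hive.hcatalog.data.JsonSerDe lives in the hive-hcatalog-core jar, which is not on Hive's classpath by default. If table creation or queries fail with a class-not-found error, register the jar first (the path below is an example for a standalone Hive 2.3 install):

ADD JAR /opt/hive/hcatalog/share/hcatalog/hive-hcatalog-core-2.3.0.jar;

A quick query confirms the JSON is being parsed:

SELECT uid, path FROM tmp.pageview LIMIT 10;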
Loading into an ORC table

ORC is Hive's optimized columnar format and is much faster to query than plain text, so the data is loaded from the external table into an ORC table.

CREATE TABLE IF NOT EXISTS orc.pageview (
  time BIGINT,
  uid BIGINT,
  ip STRING,
  path STRING,
  referer STRING
)
STORED AS ORC;

FROM tmp.pageview pv
INSERT INTO TABLE orc.pageview
SELECT time, uid, ip, path, referer;
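With the data in ORC, aggregations run against the columnar files directly; for example, a purely illustrative top-10 of page views per user:

SELECT uid, COUNT(*) AS pv
FROM orc.pageview
GROUP BY uid
ORDER BY pv DESC
LIMIT 10;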
