Beeline connected to Hive on Spark: submitted jobs return no results

When I connect through Beeline, Hive on Spark jobs get submitted but never return a result; I can only run queries through the Hive CLI. I later found that Beeline against Hive on Tez behaves the same way: a select count query just sits there waiting after being submitted.
Please reply; thanks in advance!

fish - Hadooper

Upvoted by: snowyghost

This looks like a mismatch between the versions used at build time and at runtime. For Hive, Spark, and Hadoop: do the versions each was compiled against match the versions you are actually running?
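
A quick way to double-check what is actually being run (a minimal sketch; SPARK_HOME is assumed to point at your Spark install):

    hive --version           # Hive binaries in use
    spark-submit --version   # Spark version and the Scala it was built with
    hadoop version           # Hadoop version on the PATH
    # For Spark 1.x prebuilt packages, the assembly jar name also records
    # which Hadoop release Spark was built against:
    ls $SPARK_HOME/lib/spark-assembly-*.jar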

snowyghost

Upvoted by:

Hive 2.2.0, Spark 1.6.3, Hadoop 2.7.1

fish - Hadooper

Upvoted by:

What is the specific error you are seeing? Is there anything in the error logs on the HiveServer2 side?
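
If the defaults in conf/hive-log4j2.properties were not changed, the server log ends up under the JVM temp directory; something like this (an assumption about your setup) shows it live while you re-run the query:

    # hive.log.dir defaults to ${java.io.tmpdir}/${user.name}
    tail -f /tmp/$(whoami)/hive.log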

snowyghost

Upvoted by:

Connecting through the CLI works fine:

    Logging initialized using configuration in file:/home/hadoop/hive-2.2/conf/hive-log4j2.properties Async: true
    Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
    hive> set hive.execution.engine=spark;
    hive> select count(*) from wyp7;
    Query ID = hadoop_20180212095933_85e228c7-98ae-4dd9-82e7-c3265541df7e
    Total jobs = 1
    Launching Job 1 out of 1
    In order to change the average load for a reducer (in bytes):
      set hive.exec.reducers.bytes.per.reducer=<number>
    In order to limit the maximum number of reducers:
      set hive.exec.reducers.max=<number>
    In order to set a constant number of reducers:
      set mapreduce.job.reduces=<number>
    Starting Spark Job = 0db5ab07-9898-485d-8e77-b015e74938d7
    Running with YARN Application = application_1518333864761_0008
    Kill Command = /home/hadoop/hadoop-2.7/bin/yarn application -kill application_1518333864761_0008
    Query Hive on Spark job[0] stages: [0, 1]
    Status: Running (Hive on Spark job[0])
    --------------------------------------------------------------------------------------
              STAGES   ATTEMPT        STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED
    --------------------------------------------------------------------------------------
    Stage-0 ........         0      FINISHED      1          1        0        0       0
    Stage-1 ........         0      FINISHED      1          1        0        0       0
    --------------------------------------------------------------------------------------
    STAGES: 02/02    [==========================>>] 100%  ELAPSED TIME: 81.14 s
    --------------------------------------------------------------------------------------
    Status: Finished successfully in 81.14 seconds
    OK
    4
    Time taken: 90.03 seconds, Fetched: 1 row(s)
    hive>

snowyghost

Upvoted by:

Beeline just hangs:

    ~$ beeline -u "jdbc:hive2://master:10000/default" -d org.apache.hive.jdbc.HiveDriver -n hadoop
    WARNING: Use "yarn jar" to launch YARN applications.
    Connecting to jdbc:hive2://master:10000/default
    Connected to: Apache Hive (version 2.2.0)
    Driver: Hive JDBC (version 2.2.0)
    Transaction isolation: TRANSACTION_REPEATABLE_READ
    Beeline version 2.2.0 by Apache Hive
    0: jdbc:hive2://master:10000/default> set hive.execution.engine=spark;
    No rows affected (0.318 seconds)
    0: jdbc:hive2://master:10000/default> select count(*) from wyp7;

HiveServer2 log:

    2018-02-12T10:09:44,239  INFO [Thread-55] client.SparkClientImpl: Running client driver with argv: /home/hadoop/spark-1.6/bin/spark-submit --properties-file /tmp/spark-submit.3890115630672202567.properties --class org.apache.hive.spark.client.RemoteDriver /home/hadoop/hive-2.2/lib/hive-exec-2.2.0.jar --remote-host master --remote-port 38451 --conf hive.spark.client.connect.timeout=60000 --conf hive.spark.client.server.connect.timeout=300000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 --conf hive.spark.client.rpc.server.address=null
    2018-02-12T10:09:45,281  INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=300000
    2018-02-12T10:09:45,281  INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
    2018-02-12T10:09:45,281  INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=60000
    2018-02-12T10:09:45,281  INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
    2018-02-12T10:09:45,281  INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
    2018-02-12T10:09:45,438  INFO [stderr-redir-1] client.SparkClientImpl: Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
    2018-02-12T10:09:45,438  INFO [stderr-redir-1] client.SparkClientImpl:  at java.lang.ClassLoader.defineClass1(Native Method)
    2018-02-12T10:09:45,438  INFO [stderr-redir-1] client.SparkClientImpl:  at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    2018-02-12T10:09:45,438  INFO [stderr-redir-1] client.SparkClientImpl:  at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    2018-02-12T10:09:45,438  INFO [stderr-redir-1] client.SparkClientImpl:  at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    2018-02-12T10:09:45,439  INFO [stderr-redir-1] client.SparkClientImpl:  at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    2018-02-12T10:09:45,439  INFO [stderr-redir-1] client.SparkClientImpl:  at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    2018-02-12T10:09:45,439  INFO [stderr-redir-1] client.SparkClientImpl:  at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    2018-02-12T10:09:45,439  INFO [stderr-redir-1] client.SparkClientImpl:  at java.security.AccessController.doPrivileged(Native Method)
    2018-02-12T10:09:45,439  INFO [stderr-redir-1] client.SparkClientImpl:  at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    2018-02-12T10:09:45,439  INFO [stderr-redir-1] client.SparkClientImpl:  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    2018-02-12T10:09:45,439  INFO [stderr-redir-1] client.SparkClientImpl:  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    2018-02-12T10:09:45,439  INFO [stderr-redir-1] client.SparkClientImpl:  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    [... the same defineClass/loadClass frames repeated once more ...]
    2018-02-12T10:09:45,440  INFO [stderr-redir-1] client.SparkClientImpl:  at java.lang.Class.forName0(Native Method)
    2018-02-12T10:09:45,440  INFO [stderr-redir-1] client.SparkClientImpl:  at java.lang.Class.forName(Class.java:348)
    2018-02-12T10:09:45,440  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.util.Utils$.classForName(Utils.scala:229)
    2018-02-12T10:09:45,440  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:701)
    2018-02-12T10:09:45,440  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    2018-02-12T10:09:45,440  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    2018-02-12T10:09:45,440  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    2018-02-12T10:09:45,440  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    2018-02-12T10:09:45,454 ERROR [Thread-55] client.SparkClientImpl: Error while waiting for client to connect.
    java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '6f325729-ad9b-4e27-b865-4d4c6ff52ed8'.
    Error: Child process exited before connecting back with error log
    [... the same five "Ignoring non-spark config property" warnings and the same IncompatibleClassChangeError stack trace as above ...]
            at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37) ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
            at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:99) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:95) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:69) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:89) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:79) ~[hive-exec-2.2.0.jar:2.2.0]
    Caused by: java.lang.RuntimeException: Cancel client '6f325729-ad9b-4e27-b865-4d4c6ff52ed8'.
    Error: Child process exited before connecting back with error log
    [... the same warnings and IncompatibleClassChangeError stack trace repeated a third time ...]
            at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179) ~[hive-exec-2.2.0.jar:2.2.0]
            at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:466) ~[hive-exec-2.2.0.jar:2.2.0]
            at java.lang.Thread.run(Thread.java:748) [?:1.8.0_151]
    2018-02-12T10:09:45,454  WARN [Driver] client.SparkClientImpl: Child process exited with code 1
    2018-02-12T10:09:45,455 ERROR [Thread-55] spark.SparkTask: Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
    org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
            at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64)
            at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
            at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
            at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:89)
            at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
            at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
            at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:79)

snowyghost

Upvoted by:

I switched to Hive 2.3.2 and Spark 2.1.2 and it finally works. Does anyone have an example of getting this to work with Hive 2.2?
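
That outcome fits the build/runtime mismatch suggested above: as far as I recall from the Hive on Spark getting-started guide, Hive 2.2.0 switched to building against Spark 2.0.0 and later, so running Hive 2.2.0 against Spark 1.6.3 would produce exactly this kind of class-loading error. For anyone retrying Hive 2.2+ with Spark 2.x on YARN, a minimal setup sketch (paths are illustrative, not from this thread):

    # Spark 2.x has no assembly jar; the Hive on Spark guide says to link the
    # jars Hive needs into HIVE_HOME/lib (exact file names vary by Spark build):
    ln -s $SPARK_HOME/jars/scala-library-*.jar        $HIVE_HOME/lib/
    ln -s $SPARK_HOME/jars/spark-core_*.jar           $HIVE_HOME/lib/
    ln -s $SPARK_HOME/jars/spark-network-common_*.jar $HIVE_HOME/lib/

    # Make sure HiveServer2 starts with the intended Spark, then in the session:
    #   set spark.master=yarn;
    #   set hive.execution.engine=spark;
    export SPARK_HOME=/home/hadoop/spark-2.1   # hypothetical install path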
