It seems that the remote Spark context failed to come up. I see you're
using a Spark standalone cluster. Please make sure the Spark cluster is up.
You may try spark.master=local first.
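For a quick sanity check, something like this in the same CLI session should run the query locally, with no cluster involved (the table is just the one from your session below):

hive> set hive.execution.engine=spark;
hive> set spark.master=local;
hive> select count(1) from src;

If that works, the Hive-on-Spark setup itself is fine and the problem sits between Hive and the standalone master (whose web UI, port 8080 by default, should be up and showing registered workers).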
Post by scwf
yes, I have placed the spark-assembly jar in the hive lib folder.
hive.log:
bmit.2317151720491931059.properties --class org.apache.hive.spark.client.RemoteDriver
/opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin/lib/hive-exec-1.2.0-SNAPSHOT.jar
--remote-host M151 --remote-port 56996 --conf hive.spark.client.connect.timeout=10000
--conf hive.spark.client.server.connect.timeout=90000 --conf
hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800
--conf hive.spark.client.rpc.threads=8 --conf
hive.spark.client.secret.bits=256
2015-03-02 20:33:39,893 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=10000
2015-03-02 20:33:39,894 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
2015-03-02 20:33:39,894 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
2015-03-02 20:33:39,894 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
2015-03-02 20:33:39,894 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000
2015-03-02 20:33:40,002 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - 15/03/02 20:33:40 INFO client.RemoteDriver: Connecting to: M151:56996
2015-03-02 20:33:40,005 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
2015-03-02 20:33:40,005 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
2015-03-02 20:33:40,005 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:139)
2015-03-02 20:33:40,005 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:544)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at java.lang.reflect.Method.invoke(Method.java:601)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
2015-03-02 20:33:40,006 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(553)) - at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2015-03-02 20:33:40,410 WARN [Driver]: client.SparkClientImpl (SparkClientImpl.java:run(411)) - Child process exited with code 1.
2015-03-02 20:35:08,950 WARN [main]: client.SparkClientImpl (SparkClientImpl.java:<init>(98)) - Error while waiting for client to connect.
java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
        at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
        at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:96)
        at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
        at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
        at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:65)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
        at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:114)
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1642)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1402)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1187)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1053)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1043)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
        at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
        at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
        at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
        at java.lang.Thread.run(Thread.java:722)
2015-03-02 20:35:08,979 ERROR [main]: exec.Task (SessionState.java:printError(944)) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
        at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:114)
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1642)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1402)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1187)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1053)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1043)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
        at com.google.common.base.Throwables.propagate(Throwables.java:156)
        at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106)
        at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
        at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
        at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:65)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
        ... 22 more
Caused by: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
        at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
        at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:96)
        ... 26 more
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
        at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
        at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
        at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
        at java.lang.Thread.run(Thread.java:722)
And I do not find spark.log.
Thanks.
Post by Xuefu Zhang
Could you check your hive.log and spark.log for a more detailed error
message? Quick check though: do you have spark-assembly.jar in your hive
lib folder?
Thanks,
Xuefu
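To quickly check the spark-assembly question above, something along these lines works, using the paths from this thread (adjust to your layout; in pre-2.0 Spark distributions the assembly jar sits under lib/):

ls /opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin/lib | grep spark-assembly
# if nothing shows up, copy the assembly over from the Spark build:
cp /opt/cluster/spark-1.3.0-bin-hadoop2-without-hive/lib/spark-assembly-*.jar /opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin/lib/

Restart the Hive CLI afterwards so the jar is picked up.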
Hi all,
has anyone met this error: HiveException(Failed to create spark client.)?
M151:/opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin # bin/hive
Logging initialized using configuration in jar:file:/opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin/lib/hive-common-1.2.0-SNAPSHOT.jar!/hive-log4j.properties
[INFO] Unable to bind key for unsupported operation: backward-delete-word
[INFO] Unable to bind key for unsupported operation: down-history
[INFO] Unable to bind key for unsupported operation: up-history
hive> set spark.home=/opt/cluster/spark-1.3.0-bin-hadoop2-without-hive;
hive> set hive.execution.engine=spark;
hive> set spark.master=spark://9.91.8.151:7070;
hive> select count(1) from src;
Query ID = root_20150302203333_4bed4c2a-b9a5-4d99-a485-67570e2712b7
Total jobs = 1
Launching Job 1 out of 1
set hive.exec.reducers.bytes.per.reducer=<number>
set hive.exec.reducers.max=<number>
set mapreduce.job.reduces=<number>
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
thanks