Flume with a Hive sink: errors receiving data at runtime

For this project I want Flume to deliver data directly into a Hive table rather than onto HDFS. I am using the hive sink; the Flume version is 1.7.0, the Hive version is 2.1.1, and Hive uses MySQL as its metadata store. The problem: I don't know what a1.sinks.k1.hive.metastore in the Flume configuration file should be set to, and when Flume runs and data is sent it reports the error shown below.

Sink configuration:

    a1.sinks.k1.type = hive
    a1.sinks.k1.channel = c1
    a1.sinks.k1.hive.metastore = thrift://192.168.100.223:9000
    a1.sinks.k1.hive.database = hivedb
    a1.sinks.k1.hive.table = t_traffic_his
    a1.sinks.k1.hive.partition = %Y-%m-%d:%H:%M:%S
    a1.sinks.k1.hive.txnsPerBatchAsk = 10
    a1.sinks.k1.hive.batchSize = 1500
    a1.sinks.k1.serializer = DELIMITED
    a1.sinks.k1.serializer.fieldnames = price,createdate
    a1.sinks.k1.roundUnit = hour
    a1.sinks.k1.roundValue = 1

Part of the error log:

    ERROR flume.SinkRunner: Unable to deliver event. Exception follows.
    org.apache.flume.EventDeliveryException: org.apache.flume.sink.hive.HiveWriter$ConnectException: Failed connecting to EndPoint {metaStoreUri='thrift://192.168.100.223:9000', database='hivedb', table='t_traffic_his', partitionVals=[2017-07-12] }
            at org.apache.flume.sink.hive.HiveSink.process(HiveSink.java:267)
            at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:67)
            at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:145)
            at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.flume.sink.hive.HiveWriter$ConnectException: Failed connecting to EndPoint {metaStoreUri='thrift://192.168.100.223:9000', database='hivedb', table='t_traffic_his', partitionVals=[2017-07-12] }
            at org.apache.flume.sink.hive.HiveWriter.<init>(HiveWriter.java:99)
            at org.apache.flume.sink.hive.HiveSink.getOrCreateWriter(HiveSink.java:343)
            at org.apache.flume.sink.hive.HiveSink.drainOneBatch(HiveSink.java:295)
            at org.apache.flume.sink.hive.HiveSink.process(HiveSink.java:253)
            ... 3 more
    Caused by: org.apache.flume.sink.hive.HiveWriter$ConnectException: Failed connecting to EndPoint {metaStoreUri='thrift://192.168.100.223:9000', database='hivedb', table='t_traffic_his', partitionVals=[2017-07-12] }
            at org.apache.flume.sink.hive.HiveWriter.newConnection(HiveWriter.java:383)
            at org.apache.flume.sink.hive.HiveWriter.<init>(HiveWriter.java:86)
            ... 6 more
    Caused by: org.apache.hive.hcatalog.streaming.StreamingException: java.lang.NoSuchMethodError: com.facebook.fb303.FacebookService$Client.sendBaseOneway(Ljava/lang/String;Lorg/apache/thrift/TBase;)V
            at org.apache.flume.sink.hive.HiveWriter.timedCall(HiveWriter.java:456)
            at org.apache.flume.sink.hive.HiveWriter.newConnection(HiveWriter.java:376)
            ... 7 more
    Caused by: java.util.concurrent.ExecutionException: java.lang.NoSuchMethodError: com.facebook.fb303.FacebookService$Client.sendBaseOneway(Ljava/lang/String;Lorg/apache/thrift/TBase;)V
            at java.util.concurrent.FutureTask.report(FutureTask.java:122)
            at java.util.concurrent.FutureTask.get(FutureTask.java:206)
            at org.apache.flume.sink.hive.HiveWriter.timedCall(HiveWriter.java:434)
            ... 8 more
    Caused by: java.lang.NoSuchMethodError: com.facebook.fb303.FacebookService$Client.sendBaseOneway(Ljava/lang/String;Lorg/apache/thrift/TBase;)V
            at com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436)
            at com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430)
            at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:550)
            at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDown(HiveClientCache.java:405)
            at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDown

wangxiaolei


The key line is org.apache.flume.sink.hive.HiveWriter$ConnectException: Failed connecting to EndPoint {metaStoreUri='thrift://192.168.100.223:9000': the network connection is being refused. Port 9000 is by default the NameNode port; the Hive metastore listens on 9083.
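As a minimal sketch of that fix, assuming the metastore service runs on its default port on the same host (the value should match hive.metastore.uris in your hive-site.xml):

    a1.sinks.k1.hive.metastore = thrift://192.168.100.223:9083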

wangxiaolei


Take a look at the official Hive sink configuration in the Flume user guide: http://flume.apache.org/FlumeUserGuide.html
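Besides the metastore URI, the user guide also notes that the Hive sink writes through Hive streaming ingest, so the target table has to be bucketed, stored as ORC, and transactional. A hedged DDL sketch of what hivedb.t_traffic_his could look like: the column names come from serializer.fieldnames above, while the partition column name, column types, and bucket count are assumptions to adjust for your data.

    -- sketch only: columns taken from the Flume serializer.fieldnames setting,
    -- partition column "dtime", types and bucket count are assumed
    create table hivedb.t_traffic_his (price string, createdate string)
        partitioned by (dtime string)
        clustered by (price) into 5 buckets
        stored as orc
        tblproperties ('transactional' = 'true');

Hive-side transaction support (hive.txn.manager set to DbTxnManager, hive.support.concurrency enabled, and a running compactor) also has to be configured for streaming writes to succeed.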

jane3von


I found that if hive-site.xml does not contain

    <property>
        <name>hive.metastore.uris</name>
        <value>thrift://127.0.0.1:9083</value>
    </property>

then Hive starts up fine, but as soon as I add it, it fails. The error is:

    Logging initialized using configuration in file:/usr/local/hadoop/hive-2.1.1-bin/conf/hive-log4j2.properties Async: true
    Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
            at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
            at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
            at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
            at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
            at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
            at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
            at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
            at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
            at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
            at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
            ... 9 more
    Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
            at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654)
            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
            at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
            at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
            at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
            at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
            at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
            at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
            ... 14 more
    Caused by: java.lang.reflect.InvocationTargetException
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
            at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
            ... 23 more
    Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
            at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
            at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:477)
            at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:285)
            at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
            at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
            at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
            at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
            at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
            at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
            at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
            at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
            at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
            at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
            at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
            at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
            at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
            at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
            at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
            at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: java.net.ConnectException: Connection refused
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:589)
            at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
            ... 31 more
    )
            at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:525)
            at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:285)
            at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
            ... 28 more

It looks like the Thrift port is not configured correctly; how should it be configured?
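Setting hive.metastore.uris switches Hive clients (the Hive CLI as well as the Flume hive sink) from an embedded metastore to a remote one, so a metastore service has to be listening on that port; the "Connection refused" above simply means nothing is running at 127.0.0.1:9083. A minimal sketch, assuming a single-node setup (adjust host, port, and paths to your install):

    <!-- hive-site.xml: where clients should find the metastore service -->
    <property>
        <name>hive.metastore.uris</name>
        <value>thrift://192.168.100.223:9083</value>
    </property>

    # start the standalone metastore service; it listens on port 9083 by default
    nohup hive --service metastore > metastore.log 2>&1 &

Once that service is up, the Hive CLI and the Flume sink should both be able to connect using the same URI.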

fish - Hadooper


Is port 9083 actually up? Run netstat -antp | fgrep 9083; do you see the port in the LISTEN state?
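For reference, a rough sketch of what a healthy check would look like once the metastore service is running (addresses and process id are illustrative only and will differ on your machine):

    $ netstat -antp | fgrep 9083
    tcp        0      0 0.0.0.0:9083            0.0.0.0:*               LISTEN      <pid>/java

If nothing is listed, no metastore is running on 9083; start the service as sketched above and check again.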

fish - Hadooper


Run jps -m and check what process 12131 actually is (is it the thrift server?), then look at that process's log; can you find any clues there?

nh0823


Hi, I've run into the same problem. Have you managed to solve it yet?
