HiveServer2 not coming up in Bigtop - PullRequest
0 votes
/ 01 April 2020

We are building a vanilla Hadoop 2.7.3 cluster with Hive and HBase plus Kerberos. We use the Apache Bigtop repo for Hadoop to simplify the setup.

The deployment script installs Hive and its components successfully, and although the metastore and HiveServer2 processes are running,

  • HiveServer2 is not listening on port 10000,
  • we cannot connect with beeline,
  • there are no errors,
  • it does not even create a hiveserver2.log file.
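To separate "process died" from "process bound to a different port", we check the listener directly. A minimal sketch (on older systems ss may have to be replaced with netstat -lntp):

```shell
# check whether anything is listening on the HiveServer2 thrift port (10000);
# prints a diagnostic line either way
ss -lnt 2>/dev/null | grep ':10000' || echo "nothing listening on 10000"
```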

ps -ef | grep hive shows the following output:

   hive      9043     1  2 10:57 ?        00:00:23 /usr/lib/jvm/java-openjdk/bin/java -Xmx256m -Djava.security.krb5.conf=/etc/krb5.conf -Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/lib/hive/lib/hive-service-1.2.1.jar org.apache.hadoop.hive.metastore.HiveMetaStore
hive      9751     1  2 11:04 ?        00:00:11 /usr/lib/jvm/java-openjdk/bin/java -Xmx256m -Djava.security.krb5.conf=/etc/krb5.conf -Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/lib/hive/lib/hive-service-1.2.1.jar org.apache.hive.service.server.HiveServer2
root     10285  7469  0 11:13 pts/1    00:00:00 grep hive

Connecting with beeline

[root@wnode55 ~]# beeline -u 'jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM'
ls: cannot access /usr/lib/spark/lib/spark-assembly-*.jar: No such file or directory
Connecting to jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM
Error: Could not open client transport with JDBC Uri: jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Beeline version 1.2.1 by Apache Hive
0: jdbc:hive2://wnode55.domain_name.com:10000 (closed)>
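The "Connection refused" points at the transport rather than at Kerberos, so before blaming the principal we probe raw TCP reachability, independent of JDBC. A sketch, reusing the hostname from our beeline URL:

```shell
# raw TCP probe of the HiveServer2 port, bypassing JDBC and Kerberos entirely;
# gives up after 3 seconds instead of hanging
if timeout 3 bash -c 'exec 3<>/dev/tcp/wnode55.domain_name.com/10000' 2>/dev/null; then
  echo "port 10000 reachable"
else
  echo "port 10000 not reachable"
fi
```

"Connection refused" plus a failed probe here means nothing is bound to the port at all, which matches the missing hiveserver2.log.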

hive-site.xml

[root@wnode55 ~]# cat /etc/hive/conf/hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Licensed to the Apache Software Foundation (ASF) under one or more       -->
<!-- contributor license agreements.  See the NOTICE file distributed with    -->
<!-- this work for additional information regarding copyright ownership.      -->
<!-- The ASF licenses this file to You under the Apache License, Version 2.0  -->
<!-- (the "License"); you may not use this file except in compliance with     -->
<!-- the License.  You may obtain a copy of the License at                    -->
<!--                                                                          -->
<!--     http://www.apache.org/licenses/LICENSE-2.0                           -->
<!--                                                                          -->
<!-- Unless required by applicable law or agreed to in writing, software      -->
<!-- distributed under the License is distributed on an "AS IS" BASIS,        -->
<!-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -->
<!-- See the License for the specific language governing permissions and      -->
<!-- limitations under the License.                                           -->

<configuration>

<!-- Hive Configuration can either be stored in this file or in the hadoop configuration files  -->
<!-- that are implied by Hadoop setup variables.                                                -->
<!-- Aside from Hadoop setup variables - this file is provided as a convenience so that Hive    -->
<!-- users do not have to edit hadoop configuration files (that may be managed as a centralized -->
<!-- resource).                                                                                 -->

<!-- Hive Execution Parameters -->




<property>
  <name>hbase.zookeeper.quorum</name>
  <value>wnode55.domain_name.com</value>
  <description>http://wiki.apache.org/hadoop/Hive/HBaseIntegration</description>
</property>


<property>
  <name>hive.execution.engine</name>
  <value>mr</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/var/lib/hive/metastore/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>hive.hwi.war.file</name>
  <value>/usr/lib/hive/lib/hive-hwi.war</value>
  <description>This is the WAR file with the jsp content for Hive Web Interface</description>
</property>

<property>
   <name>hive.server2.allow.user.substitution</name>
   <value>true</value>
</property>

<property>
   <name>hive.server2.enable.doAs</name>
   <value>true</value>
</property>

<property>
   <name>hive.server2.thrift.port</name>
   <value>10000</value>
</property>

<property>
   <name>hive.server2.thrift.http.port</name>
   <value>10001</value>
</property>


<property>
   <name>hive.metastore.uris</name>
   <value>thrift:/wnode55.domain_name.com:9083</value>
</property>


<property>
   <name>hive.security.metastore.authorization.manager</name>
   <value>org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider</value>
</property>

<property>
    <name>hive.server2.authentication</name>
    <value>KERBEROS</value>
</property>
<property>
    <name>hive.server2.authentication.kerberos.principal</name>
    <value>hive/_HOST@domain_name.COM</value>
</property>
<property>
    <name>hive.server2.authentication.kerberos.keytab</name>
    <value>/etc/hadoop/conf/hive.keytab</value>
</property>
</configuration>
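One sanity check worth running on a file like the one above: hive.metastore.uris values are expected to be of the form thrift://host:port, with two slashes. A shell sketch of that check, run here against inline sample values rather than the live file:

```shell
# flag metastore URIs that do not match the thrift://host:port shape;
# a single-slash typo ("thrift:/...") is reported as BAD
printf '%s\n' 'thrift:/wnode55.domain_name.com:9083' 'thrift://wnode55.domain_name.com:9083' |
while read -r uri; do
  case "$uri" in
    thrift://*:[0-9]*) echo "OK  $uri" ;;
    *)                 echo "BAD $uri" ;;
  esac
done
```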

Any help would be greatly appreciated.

...