Can't use different KDCs and realms for Kafka Connect and the HDFS Sink Connector?
0 votes
22 March 2019

I have set up Kafka Connect against a Kerberized Kafka cluster. (Say the KDC is "kafka-auth101.hadoop.local" and the realm is "KAFKA.MYCOMPANY.COM".)

Now I am trying to configure the HDFS Sink connector to write to a Kerberized Hadoop cluster with a different KDC (say the KDC is "hadoop-auth101.hadoop.local" and the realm is "HADOOP.MYCOMPANY.COM").

I added both of these realms to the krb5.conf used by Kafka Connect.

But during initialization the HDFS sink connector instance fails with a login error.

Any advice on this? Essentially, with this configuration a single JVM is trying to use two different KDCs and realms. The full error log:

>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): HADOOP.MYCOMPANY.COM
>>> KeyTabInputStream, readName(): hdfsuser
>>> KeyTab: load() entry length: 85; type: 18
Looking for keys for: hdfsuser@HADOOP.MYCOMPANY.COM
Found unsupported keytype (18) for hdfsuser@HADOOP.MYCOMPANY.COM
[2019-03-19 07:21:12,330] INFO Couldn't start HdfsSinkConnector: (io.confluent.connect.hdfs.HdfsSinkTask)
org.apache.kafka.connect.errors.ConnectException: java.io.IOException: Login failure for hdfsuser@HADOOP.MYCOMPANY.COM from keytab /etc/hadoop/keytab/stg.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
    at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:202)
    at io.confluent.connect.hdfs.HdfsSinkTask.start(HdfsSinkTask.java:76)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:232)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:145)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Login failure for hdfsuser@HADOOP.MYCOMPANY.COM from keytab /etc/hadoop/keytab/stg.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:963)
    at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:127)
    ... 10 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
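One detail worth noting from the log: "Found unsupported keytype (18)" refers to encryption type 18, which is aes256-cts-hmac-sha1-96. On JDKs older than 8u161, the JVM rejects AES-256 Kerberos keys unless the JCE "unlimited strength" policy files are installed. A quick diagnostic sketch to check this (the class name `AesPolicyCheck` is mine, not from the original setup):

```java
import javax.crypto.Cipher;

// Checks whether this JVM allows AES-256, which Kerberos enctype 18
// (aes256-cts-hmac-sha1-96) requires. If the maximum allowed key length
// is below 256, keytab entries of type 18 will be rejected, matching the
// "Found unsupported keytype (18)" line in the log.
public class AesPolicyCheck {
    public static void main(String[] args) throws Exception {
        int max = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max allowed AES key length: " + max);
        if (max < 256) {
            System.out.println("AES-256 disabled: install the unlimited-strength "
                    + "JCE policy (or upgrade the JDK) to use enctype 18 keytabs");
        }
    }
}
```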

The krb5.conf looks like this:

[logging]
   kdc = FILE:/var/log/krb5/krb5kdc.log
   admin_server = FILE:/var/log/krb5/kadmin.log
   default = FILE:/var/log/krb5/krb5libs.log

[libdefaults]
 default_realm = KAFKA.MYCOMPANY.COM
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 forwardable = yes
 allow_weak_crypto = true
 renew_lifetime = 7d
 kdc_timeout = 3000
 max_retries = 2
 clockskew = 120
 default_tkt_enctypes = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc
 default_tgs_enctypes = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc
 permitted_enctypes   = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc



[realms]
 # KDC,Realm for Kafka
 KAFKA.MYCOMPANY.COM = {
  kdc = kafka-auth101.hadoop.local
  admin_server = kafka-auth101.hadoop.local:2749
 }

 # KDC,Realm for Hadoop/HDFS
 HADOOP.MYCOMPANY.COM = {
  kdc = hadoop-auth101.hadoop.local
  admin_server = hadoop-auth101.hadoop.local:2749
 }

[appdefaults]
 pam = {
   debug = false
   ticket_lifetime = 36000
   renew_lifetime = 36000
   forwardable = true
   krb4_convert = false
 }
...
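For context, the Kerberos side of the HDFS sink is configured through the connector properties. A minimal sketch using the principal and keytab path that appear in the stack trace (the topic name and HDFS namenode URL are placeholders I made up, not values from the real setup):

```properties
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
# placeholder topic
topics=my-topic
# placeholder namenode URL
hdfs.url=hdfs://namenode.hadoop.local:8020
# Kerberos settings for the HADOOP.MYCOMPANY.COM realm
hdfs.authentication.kerberos=true
connect.hdfs.principal=hdfsuser@HADOOP.MYCOMPANY.COM
connect.hdfs.keytab=/etc/hadoop/keytab/stg.keytab
hdfs.namenode.principal=hdfs/_HOST@HADOOP.MYCOMPANY.COM
```

The Kafka side of the worker authenticates separately (via its SASL/GSSAPI settings against KAFKA.MYCOMPANY.COM), which is why the single JVM ends up holding credentials for two realms.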