Debezium Kafka connection error - TimeoutException: Timeout expired while fetching topic metadata
1 vote
/ January 24, 2020

I am getting an error on Debezium Connect, and I can't tell where I'm making a mistake or what I'm missing. Below are my connector properties and Docker file. Is it possible that Docker, deployed on one virtual machine, cannot connect to a database on another virtual machine?

kafka-connect-10    | [2020-01-23 23:37:00,202] ERROR [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:179)
kafka-connect-10    | org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
kafka-connect-10    | [2020-01-23 23:37:00,203] ERROR [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:180)
kafka-connect-10    | [2020-01-23 23:37:00,205] INFO [Procura_CDC|task-0] [Producer clientId=procura-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)
kafka-connect-10    | [2020-01-23 23:37:00,374] INFO [Procura_CDC|task-0] [Producer clientId=connector-producer-Procura_CDC-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)
kafka-connect-10    | [2020-01-23 23:37:12,772] INFO [Procura_CDC|task-0|offsets] WorkerSourceTask{id=Procura_CDC-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:416)
kafka-connect-10    | [2020-01-23 23:37:12,773] INFO [Procura_CDC|task-0|offsets] WorkerSourceTask{id=Procura_CDC-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:433)
kafka-connect-10    | [2020-01-23 23:37:12,918] INFO [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:416)
kafka-connect-10    | [2020-01-23 23:37:12,930] INFO [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:433)
kafka-connect-10    | [2020-01-23 23:37:12,930] ERROR [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:179)
kafka-connect-10    | org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
kafka-connect-10    | [2020-01-23 23:37:12,931] ERROR [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:180)
kafka-connect-10    | [2020-01-23 23:37:12,932] INFO [Procura_CDC|task-0] [Producer clientId=procura-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)
kafka-connect-10    | [2020-01-23 23:37:13,038] ERROR [Procura_CDC|task-0] Unable to unregister the MBean 'debezium.sql_server:type=connector-metrics,context=schema-history,server=procura' (io.debezium.relational.history.DatabaseHistoryMetrics:65)
kafka-connect-10    | [2020-01-23 23:37:13,039] INFO [Procura_CDC|task-0] [Producer clientId=connector-producer-Procura_CDC-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)

Docker Compose file:

version: '3'
services:

  kafka-connect-02:
    image: confluentinc/cp-kafka-connect:latest
    container_name: kafka-connect-02
    ports:
      - 8083:8083
    environment:
      CONNECT_LOG4J_APPENDER_STDOUT_LAYOUT_CONVERSIONPATTERN: "[%d] %p %X{connector.context}%m (%c:%L)%n"
      CONNECT_CUB_KAFKA_TIMEOUT: 300
      CONNECT_BOOTSTRAP_SERVERS: "https://***9092"
      CONNECT_REST_ADVERTISED_HOST_NAME: 'kafka-connect-02'
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: _kafka-connect-group-01-v04
      CONNECT_CONFIG_STORAGE_TOPIC: _kafka-connect-group-01-v04-configs
      CONNECT_OFFSET_STORAGE_TOPIC: _kafka-connect-group-01-v04-offsets
      CONNECT_STATUS_STORAGE_TOPIC: _kafka-connect-group-01-v04-status
      CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: "https://***9092"
      CONNECT_KEY_CONVERTER_BASIC_AUTH_CREDENTIALS_SOURCE: "USER_INFO"
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_BASIC_AUTH_USER_INFO: "***:***"
      CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: "https://***9092"
      CONNECT_VALUE_CONVERTER_BASIC_AUTH_CREDENTIALS_SOURCE: "USER_INFO"
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_BASIC_AUTH_USER_INFO: "***:***"
      CONNECT_INTERNAL_KEY_CONVERTER: 'org.apache.kafka.connect.json.JsonConverter'
      CONNECT_INTERNAL_VALUE_CONVERTER: 'org.apache.kafka.connect.json.JsonConverter'
      CONNECT_LOG4J_ROOT_LOGLEVEL: 'INFO'
      CONNECT_LOG4J_LOGGERS: 'org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR'
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_PLUGIN_PATH: '/usr/share/java,/usr/share/confluent-hub-components/'
      # Confluent Cloud config
      CONNECT_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_RETRY_BACKOFF_MS: "500"
      CONNECT_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_SASL_MECHANISM: "PLAIN"
      CONNECT_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"***\";"
      #
      CONNECT_CONSUMER_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_CONSUMER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_CONSUMER_SASL_MECHANISM: "PLAIN"
      CONNECT_CONSUMER_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"***\";"
      CONNECT_CONSUMER_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_CONSUMER_RETRY_BACKOFF_MS: "500"
      #
      CONNECT_PRODUCER_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_PRODUCER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_PRODUCER_SASL_MECHANISM: "PLAIN"
      CONNECT_PRODUCER_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"***\";"
      CONNECT_PRODUCER_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_PRODUCER_RETRY_BACKOFF_MS: "500"
      # External secrets config
      # See https://docs.confluent.io/current/connect/security.html#externalizing-secrets
      CONNECT_CONFIG_PROVIDERS: 'file'
      CONNECT_CONFIG_PROVIDERS_FILE_CLASS: 'org.apache.kafka.common.config.provider.FileConfigProvider'
    command: 
      - bash 
      - -c 
      - |
        echo "Installing connector plugins"
        confluent-hub install --no-prompt debezium/debezium-connector-sqlserver:0.10.0
        confluent-hub install --no-prompt snowflakeinc/snowflake-kafka-connector:0.5.5
        #
        echo "Launching Kafka Connect worker"
        /etc/confluent/docker/run &  

        #
        sleep infinity

Debezium connector:

curl -i -X PUT -H "Content-Type:application/json" http://localhost:8083/connectors/Procura_CDC/config -d '{
    "connector.class":"io.debezium.connector.sqlserver.SqlServerConnector",
    "tasks.max":"1",
    "database.server.name":"***",
    "database.hostname":"***",
    "database.port":"***",
    "database.user":"Kafka",
    "database.password":"***",
    "database.dbname":"Procura_Prod",
    "database.history.kafka.bootstrap.servers":"*****",
    "database.history.kafka.topic":"dbhistory.procura",
    "table.whitelist":"dbo.CLIENTS,dbo.VISITS",
    "poll.interval.ms":"2000",
    "snapshot.fetch.size":"2000",
    "snapshot.mode":"initial",
    "snapshot.isolation.mode":"snapshot",
    "transforms":"unwrap,dropPrefix",
    "transforms.unwrap.type":"io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones":"false",
    "transforms.unwrap.delete.handling.mode":"rewrite",
    "transforms.dropPrefix.type":"org.apache.kafka.connect.transforms.RegexRouter",
    "transforms.dropPrefix.regex":"procura.dbo.(.*)",
    "transforms.dropPrefix.replacement":"$1" }'

Thanks

1 Answer

0 votes
/ January 28, 2020

The error says that it cannot connect to Kafka.

The following should not have https:// applied to them:

  • CONNECT_BOOTSTRAP_SERVERS
  • CONNECT_CONSUMER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM
  • CONNECT_PRODUCER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM

In fact, I'm not even sure the last two are valid configs.
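To illustrate the point: Kafka's bootstrap.servers expects bare host:port pairs with no URL scheme, while the Schema Registry URL does take http(s):// (and the registry listens on its own port, not the broker's 9092). A sketch of what the corrected environment block might look like, with hypothetical Confluent Cloud host names standing in for the redacted ones:

```yaml
# hypothetical endpoints -- substitute your own
CONNECT_BOOTSTRAP_SERVERS: "pkc-12345.us-east-1.aws.confluent.cloud:9092"          # no https://
CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: "https://psrc-12345.us-east-1.aws.confluent.cloud"    # scheme is fine here, but this is the registry, not port 9092 on the broker
CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: "https://psrc-12345.us-east-1.aws.confluent.cloud"
CONNECT_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"   # here "https" is a literal algorithm name, not a URL
```

Note that the question's compose file also uses "https://***9092" for the converter Schema Registry URLs, which looks like the broker address was pasted in by mistake.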


You can also remove the REQUEST_TIMEOUT_MS and RETRY_BACKOFF_MS configs unless you have a specific use case for setting them.


And AFAIK, CONFIG_PROVIDERS_FILE_CLASS is not required, since file is the default implementation.
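For context, once the file provider is enabled, connector configs can reference secrets with placeholders of the form ${file:&lt;path&gt;:&lt;key&gt;} instead of literal values. A minimal sketch, assuming a properties file mounted into the container at a hypothetical path /secrets/procura.properties:

```properties
# /secrets/procura.properties (hypothetical mount path)
procura.user=Kafka
procura.password=changeme

# In the connector JSON, reference the keys instead of literals:
#   "database.user":     "${file:/secrets/procura.properties:procura.user}",
#   "database.password": "${file:/secrets/procura.properties:procura.password}"
```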

Tip: use an .env file to reduce the number of variables needed in the YAML.
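For example, Docker Compose substitutes ${VAR} references from an .env file sitting next to docker-compose.yml, so the cluster address and credentials need only live in one place (all names below are illustrative):

```yaml
# .env (next to docker-compose.yml):
#   CCLOUD_BROKER=pkc-12345.us-east-1.aws.confluent.cloud:9092
#   CCLOUD_API_KEY=my-api-key
#   CCLOUD_API_SECRET=my-api-secret

# docker-compose.yml fragment -- Compose substitutes ${VAR} at parse time
environment:
  CONNECT_BOOTSTRAP_SERVERS: "${CCLOUD_BROKER}"
  CONNECT_SASL_JAAS_CONFIG: 'org.apache.kafka.common.security.plain.PlainLoginModule required username="${CCLOUD_API_KEY}" password="${CCLOUD_API_SECRET}";'
```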
