I'm building an Apache Beam pipeline that reads from Kafka using KafkaIO, but I'm not sure how to resolve a serialization problem.
This is how KafkaIO is used:
this.pipeline
    .apply("ReadFromKafka",
        KafkaIO.<byte[], byte[]>read()
            .withConsumerFactoryFn(input -> {
                this.updateKafkaConsumerProperties(this.kafkaConsumerConfig, input);
                return new KafkaConsumer<>(input);
            })
            .withBootstrapServers(kafkaConsumerConfig.getBootstrapServer())
            .withTopic(this.pipelineSourceKafkaConfiguration.getOnboardingTopic())
            .withKeyDeserializer(ByteArrayDeserializer.class)
            .withValueDeserializer(ByteArrayDeserializer.class))
    .apply("WindowTheData", Window.into(FixedWindows.of(Duration.standardSeconds(5))))
    ...
However, my driver program fails to start and throws the following:
java.lang.IllegalArgumentException: unable to serialize org.apache.beam.sdk.io.kafka.KafkaUnboundedSource@65bd19bf
at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:57)
at org.apache.beam.sdk.util.SerializableUtils.clone(SerializableUtils.java:107)
at org.apache.beam.sdk.util.SerializableUtils.ensureSerializable(SerializableUtils.java:86)
at org.apache.beam.sdk.io.Read$Unbounded.<init>(Read.java:137)
at org.apache.beam.sdk.io.Read$Unbounded.<init>(Read.java:132)
at org.apache.beam.sdk.io.Read.from(Read.java:55)
at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:665)
at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:277)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:537)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:491)
at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:56)
at org.apache.beam.sdk.Pipeline.apply(Pipeline.java:188)
at com.company.lib.pipelines.DataPersistencePipeline.execute(DataPersistencePipeline.java:64)
at com.company.app.MainApp.registerPipelineEndpoints(MainApp.java:102)
at com.company.app.MainApp.run(MainApp.java:81)
at com.company.app.MainApp.run(MainApp.java:44)
at io.dropwizard.cli.EnvironmentCommand.run(EnvironmentCommand.java:43)
at io.dropwizard.cli.ConfiguredCommand.run(ConfiguredCommand.java:87)
at io.dropwizard.cli.Cli.run(Cli.java:78)
at io.dropwizard.Application.run(Application.java:93)
at com.company.app.MainApp.main(MainApp.java:51)
Caused by: java.io.NotSerializableException: com.company.lib.pipelines.DataPersistencePipeline
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:53)
... 20 more
The exception complains that the org.apache.beam.sdk.io.kafka.KafkaUnboundedSource object is not serializable. That class comes from the Apache Beam SDK and it does implement the Serializable interface, so I'm not sure where I went wrong.
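For reference, the Caused by line complains about com.company.lib.pipelines.DataPersistencePipeline rather than KafkaUnboundedSource, which makes me suspect that the lambda passed to withConsumerFactoryFn captures this (it calls this.updateKafkaConsumerProperties(this.kafkaConsumerConfig, input)), so Beam ends up trying to serialize the whole pipeline class along with the source. Below is a minimal sketch of a factory that avoids capturing the enclosing class, assuming the extra consumer settings can be passed in as a plain map (ConsumerFactory and extraProperties are hypothetical names, not part of my actual code):

import java.util.HashMap;
import java.util.Map;
import org.apache.beam.sdk.transforms.SerializableFunction;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// Sketch only: a standalone, serializable consumer factory that keeps its
// configuration in a field instead of reaching back into the pipeline class.
public class ConsumerFactory
        implements SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>> {

    // Extra Kafka settings; the map values must themselves be serializable.
    private final HashMap<String, Object> extraProperties;

    public ConsumerFactory(Map<String, Object> extraProperties) {
        this.extraProperties = new HashMap<>(extraProperties);
    }

    @Override
    public Consumer<byte[], byte[]> apply(Map<String, Object> config) {
        // Merge the config Beam passes in with the extra settings, then build the consumer.
        Map<String, Object> merged = new HashMap<>(config);
        merged.putAll(extraProperties);
        return new KafkaConsumer<>(merged);
    }
}

// usage: .withConsumerFactoryFn(new ConsumerFactory(extraProperties))

Would something along these lines avoid the serialization error, or am I missing something else?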