Duplicate filename in GCP Storage
1 vote
/ March 15, 2019

I am streaming files into a GCP Storage bucket. This produces a frequent error (roughly 2 million occurrences per day) stating that my filename policy must generate unique names. I have tried several ways to guarantee uniqueness, such as using currentTimeMillis, the current thread, encrypting the filename, etc. Each of them seems to eliminate the error for a few hours or a day before it returns at the same rate. I have not yet found a case where a file was actually missing, but I also have not searched thoroughly. This is my writer:

pipeline.apply("Read PubSub Events", PubsubIO.readMessagesWithAttributes().fromSubscription(options.getInputSubscription())) 
                    .apply(options.getWindowDuration() + " Window",
                                Window.<PubsubMessage>into(FixedWindows.of(parseDuration(options.getWindowDuration())))
                                    .triggering(AfterWatermark.pastEndOfWindow())
                                    .discardingFiredPanes()
                                    .withAllowedLateness(parseDuration("24h")))
                    .apply(new GenericFunctions.extractMsg())
                    .apply(FileIO.<String, String>writeDynamic()
                            .by(new datePartition(options.getOutputFilenamePrefix()))
                            .via(TextIO.sink())
                            .withNumShards(options.getNumShards())
                            .to(options.getOutputDirectory())
                            .withNaming(type -> FileIO.Write.defaultNaming(type, ".txt"))
                            .withDestinationCoder(StringUtf8Coder.of())); 

I can elaborate on any of the non-Google pieces we use if needed. This is the error that Stackdriver reports:

exception:  "java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.lang.IllegalArgumentException: Filename policy must generate unique filenames, but generated the same name gs://myBucket/myProject/2019/01/21/13h/myProject-2123065519-2019-01-21T12:58:00.000Z-2019-01-21T13:00:00.000Z-00000-of-00001.txt for file results FileResult{tempFilename=gs://myBucket/myProject/.temp-beam-2019-01-17_20-30-37-1/2e2c99ef-6508-4f10-ada4-e5c108b1d884, shard=0, window=[2019-01-21T12:58:00.000Z..2019-01-21T13:00:00.000Z), paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} and FileResult{tempFilename=gs://myBucket/myProject/.temp-beam-2019-01-17_20-30-37-1/2112c27f-8e0c-4831-b3fe-fbefe9ed560e, shard=0, window=[2019-01-21T12:58:00.000Z..2019-01-21T13:00:00.000Z), paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}}
    at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:185)
    at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue(GroupAlsoByWindowFnRunner.java:102)
    at org.apache.beam.runners.dataflow.worker.StreamingGroupAlsoByWindowReshuffleFn.processElement(StreamingGroupAlsoByWindowReshuffleFn.java:58)
    at org.apache.beam.runners.dataflow.worker.StreamingGroupAlsoByWindowReshuffleFn.processElement(StreamingGroupAlsoByWindowReshuffleFn.java:40)
    at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
    at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:135)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:45)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:50)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:202)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:160)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    at org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1226)
    at org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1000(StreamingDataflowWorker.java:141)
    at org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$6.run(StreamingDataflowWorker.java:965)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: java.lang.IllegalArgumentException: Filename policy must generate unique filenames, but generated the same name gs://myBucket/myProject/2019/01/21/13h/myProject-2123065519-2019-01-21T12:58:00.000Z-2019-01-21T13:00:00.000Z-00000-of-00001.txt for file results FileResult{tempFilename=gs://myBucket/myProject/.temp-beam-2019-01-17_20-30-37-1/2e2c99ef-6508-4f10-ada4-e5c108b1d884, shard=0, window=[2019-01-21T12:58:00.000Z..2019-01-21T13:00:00.000Z), paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} and FileResult{tempFilename=gs://myBucket/myProject/.temp-beam-2019-01-17_20-30-37-1/2112c27f-8e0c-4831-b3fe-fbefe9ed560e, shard=0, window=[2019-01-21T12:58:00.000Z..2019-01-21T13:00:00.000Z), paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}}
    at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:34)
    at org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
    at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:326)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:45)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:50)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:273)
    at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
    at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:77)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
    at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:71)
    at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:128)
    at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
    at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:326)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:45)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:50)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:273)
    at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
    at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:77)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:609)
    at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:85)
    at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
    at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:326)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:45)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:50)
    at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:183)
    ... 17 more
Caused by: java.lang.IllegalArgumentException: Filename policy must generate unique filenames, but generated the same name gs://myBucket/myProject/2019/01/21/13h/myProject-2123065519-2019-01-21T12:58:00.000Z-2019-01-21T13:00:00.000Z-00000-of-00001.txt for file results FileResult{tempFilename=gs://myBucket/myProject/.temp-beam-2019-01-17_20-30-37-1/2e2c99ef-6508-4f10-ada4-e5c108b1d884, shard=0, window=[2019-01-21T12:58:00.000Z..2019-01-21T13:00:00.000Z), paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} and FileResult{tempFilename=gs://myBucket/myProject/.temp-beam-2019-01-17_20-30-37-1/2112c27f-8e0c-4831-b3fe-fbefe9ed560e, shard=0, window=[2019-01-21T12:58:00.000Z..2019-01-21T13:00:00.000Z), paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}}
    at org.apache.beam.repackaged.beam_sdks_java_core.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
    at org.apache.beam.sdk.io.FileBasedSink$WriteOperation.finalizeDestination(FileBasedSink.java:656)
    at org.apache.beam.sdk.io.WriteFiles.finalizeAllDestinations(WriteFiles.java:819)
    at org.apache.beam.sdk.io.WriteFiles.access$1600(WriteFiles.java:112)
    at org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn.process(WriteFiles.java:796)
"   

1 Answer

3 votes
/ March 15, 2019

I would use the methods that were built for exactly this. For example, UUID.randomUUID().toString() creates a type-4 UUID, which can be used to give files unique names.
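For reference, here is a minimal sketch of a custom FileIO.Write.FileNaming that appends a random UUID to the window-based name. The class name UuidFileNaming and the exact filename pattern are my own illustration, not code from your pipeline:

import java.util.UUID;

import org.apache.beam.sdk.io.Compression;
import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
import org.apache.beam.sdk.transforms.windowing.PaneInfo;

/** Naming policy that appends a random type-4 UUID so every finalized file gets a distinct name. */
class UuidFileNaming implements FileIO.Write.FileNaming {

    private final String prefix;

    UuidFileNaming(String prefix) {
        this.prefix = prefix;
    }

    @Override
    public String getFilename(
            BoundedWindow window, PaneInfo pane, int numShards, int shardIndex, Compression compression) {
        // The pipeline uses FixedWindows, so the window can be treated as an IntervalWindow.
        IntervalWindow intervalWindow = (IntervalWindow) window;
        // UUID.randomUUID() produces a type-4 (random) UUID, so two FileResults for the same
        // window and shard can no longer collapse to the same destination name.
        return String.format(
                "%s-%s-%s-%05d-of-%05d-%s.txt",
                prefix,
                intervalWindow.start(),
                intervalWindow.end(),
                shardIndex,
                numShards,
                UUID.randomUUID());
    }
}

You would then plug it in with .withNaming(dest -> new UuidFileNaming(dest)) in place of FileIO.Write.defaultNaming. One caveat: with a random component the generated name is no longer deterministic across retries of the finalize step, so a retried bundle may leave an extra output file instead of overwriting the previous one.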
