Kinesis Binder default read and write capacity for DynamoDB tables
0 votes / 4 May 2019

According to the Spring Cloud Stream AWS Kinesis Binder documentation, the default values for readCapacity and writeCapacity are 1:

https://github.com/spring-cloud/spring-cloud-stream-binder-aws-kinesis/blob/master/spring-cloud-stream-binder-kinesis-docs/src/main/asciidoc/overview.adoc#lockregistry

readCapacity: The read capacity of the DynamoDB table. See Provisioned Throughput.

Default: 1

writeCapacity: The write capacity of the DynamoDB table. See Provisioned Throughput.

Default: 1

From the Kinesis Client Library code, I see that the defaults are 10:

https://github.com/awslabs/amazon-kinesis-client/blob/master/amazon-kinesis-client/src/main/java/software/amazon/kinesis/leases/dynamodb/TableConstants.java

Is there any setting for this in the Spring Kinesis Binder?

Edit

I have a lock table with read and write capacity of 40.

I configured my binder like this:

spring: 
  cloud:
    stream:
      kinesis:
        binder:
          locks:
            table: customLocks
            readCapacity: 5
            writeCapacity: 2
          checkpoint:
            table: customCheckPoints
            readCapacity: 5
            writeCapacity: 2
        bindings:
          inputone:
            consumer:
              listenerMode: batch
              idleBetweenPolls: 500
              recordsLimit: 50
          inputtwo:
            consumer:
              listenerMode: batch
              idleBetweenPolls: 500
              recordsLimit: 50
      bindings:
        inputone:
          group: my-group-1
          destination: stream-1
          content-type: application/json
        inputtwo:
          group: my-group-2
          destination: stream-2
          content-type: application/json

I have three containers running with this configuration.

I am seeing ProvisionedThroughputExceededException for the customLocks table.

I am not sure why the binder is putting so much load on the DynamoDB lock table.
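A rough back-of-envelope estimate of the load (the shard count and retry rate below are placeholders I made up, not the binder's real numbers): every lock attempt in the stack trace below ends in a GetItem on customLocks, so the reads scale with the number of containers, the number of shards they compete for, and how often lock acquisition is retried.

// Rough, hypothetical estimate of the read capacity consumed by lock acquisition.
// Assumes each tryAcquireLock issues one strongly consistent GetItem (about 1 RCU
// per item up to 4 KB), as suggested by executeGetItem in the stack trace below.
public class LockTableLoadEstimate {

    public static void main(String[] args) {
        int containers = 3;                          // from the question
        int shards = 4;                              // ASSUMPTION: real shard count not given
        double lockAttemptsPerShardPerSecond = 2.0;  // ASSUMPTION: placeholder retry rate

        double estimatedRcuPerSecond = containers * shards * lockAttemptsPerShardPerSecond;
        // With these placeholder numbers: 3 * 4 * 2 = 24 RCU/s, well above readCapacity 5.
        System.out.printf("~%.0f RCU/s needed vs. readCapacity=5 on the binder%n",
                estimatedRcuPerSecond);
    }
}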

2019-05-05 07:49:52.216  WARN --- [-kinesis-shard-locks-1] ices.dynamodbv2.AmazonDynamoDBLockClient : Could not acquire lock because of a client side failure in talking to DDB
com.amazonaws.services.dynamodbv2.model.ProvisionedThroughputExceededException: The level of configured provisioned throughput for the table was exceeded. Consider increasing your provisioning level with the UpdateTable API. (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ProvisionedThroughputExceededException; Request ID: 94CURTLH858HM3RRELMSB6J817VV4KQNSO5AEMVJF66Q9ASUAAJG)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1632)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1304)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1058)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:743)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.doInvoke(AmazonDynamoDBClient.java:3452)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:3428)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.executeGetItem(AmazonDynamoDBClient.java:1789)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.getItem(AmazonDynamoDBClient.java:1764)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBLockClient.readFromDynamoDB(AmazonDynamoDBLockClient.java:997)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBLockClient.getLockFromDynamoDB(AmazonDynamoDBLockClient.java:743)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBLockClient.acquireLock(AmazonDynamoDBLockClient.java:402)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBLockClient.tryAcquireLock(AmazonDynamoDBLockClient.java:567)
    at org.springframework.integration.aws.lock.DynamoDbLockRegistry$DynamoDbLock.doLock(DynamoDbLockRegistry.java:504)
    at org.springframework.integration.aws.lock.DynamoDbLockRegistry$DynamoDbLock.tryLock(DynamoDbLockRegistry.java:478)
    at org.springframework.integration.aws.lock.DynamoDbLockRegistry$DynamoDbLock.tryLock(DynamoDbLockRegistry.java:452)
    at org.springframework.integration.aws.inbound.kinesis.KinesisMessageDrivenChannelAdapter$ShardConsumerManager.lambda$run$0(KinesisMessageDrivenChannelAdapter.java:1198)
    at org.springframework.integration.aws.inbound.kinesis.KinesisMessageDrivenChannelAdapter$ShardConsumerManager.dt_access$257(KinesisMessageDrivenChannelAdapter.java)
    at java.util.Collection.removeIf(Collection.java:414)
    at org.springframework.integration.aws.inbound.kinesis.KinesisMessageDrivenChannelAdapter$ShardConsumerManager.run(KinesisMessageDrivenChannelAdapter.java:1191)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)

1 Answer

0 votes / 4 May 2019

Keep in mind that the more capacity you provision, the more you pay on your AWS account. That configuration can indeed be changed via application.properties:

spring.cloud.stream.kinesis.binder.locks.readCapacity = 10
spring.cloud.stream.kinesis.binder.locks.writeCapacity = 10

And that is exactly what the Kinesis Binder documentation linked above explains.
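If you want to confirm which provisioned capacity the lock table actually has, a quick check is possible with the same AWS SDK v1 DynamoDB client that appears in the stack trace. This is only a hypothetical verification snippet; the customLocks table name is taken from the question:

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughputDescription;

public class DescribeLockTable {

    public static void main(String[] args) {
        // Uses the default credentials/region provider chain; adjust for your environment.
        AmazonDynamoDB dynamoDb = AmazonDynamoDBClientBuilder.defaultClient();

        ProvisionedThroughputDescription throughput = dynamoDb
                .describeTable("customLocks")   // lock table name from the question
                .getTable()
                .getProvisionedThroughput();

        System.out.println("read capacity:  " + throughput.getReadCapacityUnits());
        System.out.println("write capacity: " + throughput.getWriteCapacityUnits());
    }
}

If the printed values do not match what you configured on the binder, the table was most likely created beforehand with a different capacity; as far as I can tell, the binder's readCapacity/writeCapacity settings only apply when it creates the table itself.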

...