I'm building a reinforcement-learning DNN (a DQN), but when I feed my data into the model I get the following error:
ValueError: Error when checking target: expected dense_2 to have 2 dimensions, but got array with shape (64, 4, 1)
My input shape is (1, 139); with a minibatch size of 64, a batch of states has shape (64, 1, 139).
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, BatchNormalization, Dense, Flatten

def create_model(self):
    model = Sequential()
    model.add(LSTM(128, input_shape=(1, 139), return_sequences=True, stateful=False))
    model.add(Dropout(0.2))
    model.add(BatchNormalization())
    model.add(LSTM(128, return_sequences=True, stateful=False))
    model.add(Dropout(0.2))
    model.add(BatchNormalization())
    model.add(Dense(32, activation='relu'))
    model.add(Dropout(0.2))
    model.add(Flatten())
    model.add(Dense(4, activation='softmax'))
    # Model compile settings:
    opt = tf.keras.optimizers.Adam(lr=0.001, decay=1e-6)
    # Compile model
    model.compile(loss='sparse_categorical_crossentropy', optimizer=opt, metrics=['accuracy'])
    return model
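For reference, here is a minimal stand-in for the data I pass to fit(); the random arrays below are only placeholders with the same shapes as my real replay-buffer states and targets, not the actual data:

import numpy as np

# Placeholder batch: same shapes as my real data, random values
X = np.random.random((64, 1, 139)).astype('float32')  # states, matches input_shape=(1, 139)
y = np.random.random((64, 4, 1)).astype('float32')    # targets -- this is the shape the error reports

print(X.shape)  # (64, 1, 139)
print(y.shape)  # (64, 4, 1) -- rank 3, while dense_2 outputs (None, 4), i.e. rank 2

# model.fit(X, y, batch_size=64)  # this call raises the ValueError above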
I ran the model summaries:
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm_1 (LSTM) (None, 1, 128) 137216
_________________________________________________________________
dropout_1 (Dropout) (None, 1, 128) 0
_________________________________________________________________
batch_normalization_1 (Batch (None, 1, 128) 512
_________________________________________________________________
lstm_2 (LSTM) (None, 1, 128) 131584
_________________________________________________________________
dropout_2 (Dropout) (None, 1, 128) 0
_________________________________________________________________
batch_normalization_2 (Batch (None, 1, 128) 512
_________________________________________________________________
dense_1 (Dense) (None, 1, 32) 4128
_________________________________________________________________
dropout_3 (Dropout) (None, 1, 32) 0
_________________________________________________________________
flatten_1 (Flatten) (None, 32) 0
_________________________________________________________________
dense_2 (Dense) (None, 4) 132
=================================================================
Total params: 274,084
Trainable params: 273,572
Non-trainable params: 512
_________________________________________________________________
None
Model: "sequential_2"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm_3 (LSTM) (None, 1, 128) 137216
_________________________________________________________________
dropout_4 (Dropout) (None, 1, 128) 0
_________________________________________________________________
batch_normalization_3 (Batch (None, 1, 128) 512
_________________________________________________________________
lstm_4 (LSTM) (None, 1, 128) 131584
_________________________________________________________________
dropout_5 (Dropout) (None, 1, 128) 0
_________________________________________________________________
batch_normalization_4 (Batch (None, 1, 128) 512
_________________________________________________________________
dense_3 (Dense) (None, 1, 32) 4128
_________________________________________________________________
dropout_6 (Dropout) (None, 1, 32) 0
_________________________________________________________________
flatten_2 (Flatten) (None, 32) 0
_________________________________________________________________
dense_4 (Dense) (None, 4) 132
=================================================================
Total params: 274,084
Trainable params: 273,572
Non-trainable params: 512
_________________________________________________________________
None
Shouldn't the Flatten layer make the output a 2-D array? Any ideas? :-/
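For completeness, I also checked the output shapes programmatically rather than only via summary() (a quick sketch; it assumes `model` is the object returned by create_model() and a tf.keras version where layers expose output_shape):

# Quick shape check on the built model
for layer in model.layers:
    print(layer.name, layer.output_shape)

print(model.output_shape)  # (None, 4) -- the output after Flatten + Dense really is 2-D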