Adding a batch normalization layer to an LSTM encoder-decoder model

I'm interested in how to add a BatchNormalization layer to an LSTM encoder-decoder model. I have code for an LSTM encoder-decoder model that does time-series forecasting.

from keras.layers import Input, LSTM, Dense
from keras.models import Model
from keras.optimizers import Adam

num_features = X_train.shape[2]
# Define an input series and encode it with an LSTM. 
encoder_inputs = Input(shape=(None, num_features)) 
encoder = LSTM(units_size, return_state=True, dropout=dropout)
encoder_outputs, state_h, state_c = encoder(encoder_inputs)

# We discard `encoder_outputs` and only keep the final states. These represent the "context"
# vector that we use as the basis for decoding.
encoder_states = [state_h, state_c]

# Set up the decoder, using `encoder_states` as initial state.
# This is where teacher forcing inputs are fed in.
decoder_inputs = Input(shape=(None, 1)) 

# We set up our decoder using `encoder_states` as initial state.  
# We return full output sequences and return internal states as well. 
# We don't use the return states in the training model, but we will use them in inference.
decoder_lstm = LSTM(units_size, return_sequences=True, return_state=True, dropout=dropout)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs,
                                     initial_state=encoder_states)

decoder_dense = Dense(1) # 1 continuous output at each timestep
decoder_outputs = decoder_dense(decoder_outputs)

# Define the model that will turn
# `encoder_input_data` & `decoder_input_data` into `decoder_target_data`
model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(Adam(lr=learning_rate), loss='mean_absolute_error')

I would like to add a BatchNormalization layer to the decoder part, but I don't know whether I should use it there at all. I would appreciate any help.
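For example, one placement I can imagine (just a sketch, not necessarily the right one) is normalizing the decoder LSTM activations right before the final Dense projection. This reuses the variables defined above (decoder_lstm, decoder_inputs, encoder_states, decoder_dense):

from keras.layers import BatchNormalization

# Same decoder as above, but with batch normalization applied to the
# LSTM output sequence before the Dense head. By default Keras
# normalizes over the last axis, i.e. per feature of the
# (batch, time, units) tensor.
decoder_outputs, _, _ = decoder_lstm(decoder_inputs,
                                     initial_state=encoder_states)
decoder_outputs = BatchNormalization()(decoder_outputs)
decoder_outputs = decoder_dense(decoder_outputs)

From what I've read, batch normalization is used less often inside the recurrent parts of a network than on dense heads, so I'm unsure whether this placement (or batch norm at all) makes sense here.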

...