TensorFlow throws a ValueError after entering the first epoch - PullRequest
2 votes
/ 09 May 2020

I get the error "ValueError: Shapes (None, None) and (None, 8, 8, 7) are incompatible" every time I train my model in TensorFlow. The call:

history = model.fit(train_batches,
                    steps_per_epoch=train_steps,
                    class_weight=class_weights,
                    validation_data=validation_batches,
                    validation_steps=val_steps,
                    epochs=30,
                    verbose=1,
                    callbacks=callbacks_list
                    )

produces the following stack trace:

Traceback (most recent call last):
  File "/home/brian/Desktop/381-deep-learning/main.py", line 410, in <module>
    epochs=30
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/engine/training.py", line 1479, in fit_generator
    initial_epoch=initial_epoch)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/engine/training.py", line 66, in _method_wrapper
    return method(self, *args, **kwargs)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/engine/training.py", line 848, in fit
    tmp_logs = train_function(iterator)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/eager/def_function.py", line 580, in __call__
    result = self._call(*args, **kwds)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/eager/def_function.py", line 627, in _call
    self._initialize(args, kwds, add_initializers_to=initializers)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/eager/def_function.py", line 506, in _initialize
    *args, **kwds))
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 2446, in _get_concrete_function_internal_garbage_collected
    graph_function, _, _ = self._maybe_define_function(args, kwargs)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 2777, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/eager/function.py", line 2667, in _create_graph_function
    capture_by_value=self._capture_by_value),
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/framework/func_graph.py", line 981, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/eager/def_function.py", line 441, in wrapped_fn
    return weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "/home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/framework/func_graph.py", line 968, in wrapper
    raise e.ag_error_metadata.to_exception(e)
ValueError: in user code:

    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/engine/training.py:571 train_function  *
        outputs = self.distribute_strategy.run(
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/distribute/distribute_lib.py:951 run  **
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/distribute/distribute_lib.py:2290 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/distribute/distribute_lib.py:2649 _call_for_each_replica
        return fn(*args, **kwargs)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/engine/training.py:533 train_step  **
        y, y_pred, sample_weight, regularization_losses=self.losses)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/engine/compile_utils.py:205 __call__
        loss_value = loss_obj(y_t, y_p, sample_weight=sw)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/losses.py:143 __call__
        losses = self.call(y_true, y_pred)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/losses.py:246 call
        return self.fn(y_true, y_pred, **self._fn_kwargs)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/losses.py:1527 categorical_crossentropy
        return K.categorical_crossentropy(y_true, y_pred, from_logits=from_logits)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/keras/backend.py:4561 categorical_crossentropy
        target.shape.assert_is_compatible_with(output.shape)
    /home/brian/Desktop/381-deep-learning/venv/lib/python3.6/site-packages/tensorflow/python/framework/tensor_shape.py:1117 assert_is_compatible_with
        raise ValueError("Shapes %s and %s are incompatible" % (self, other))

    ValueError: Shapes (None, None) and (None, 8, 8, 7) are incompatible


Process finished with exit code 1

as soon as it enters epoch 1/30.

Here is my model definition, in case anyone is interested:

# Create Inception Res Net model as used in paper
resnet = tf.keras.applications.inception_resnet_v2.InceptionResNetV2()

print("Layers of ResNet: "+str(len(resnet.layers))) //782 layers

x = resnet.layers[-28].output

x = tf.keras.layers.Dropout(0.25)(x)

# Make a prediction layer with 7 nodes for the 7 dir in our train_dir.
predictions_layer = tf.keras.layers.Dense(7, activation='softmax')(x)

# print(resnet.input)

# inputs=resnet.input selects the input layer, outputs=predictions refers to the
# dense layer we created above.

model = tf.keras.Model(inputs=resnet.input, outputs=predictions_layer)

I suspect the cause of my problem is the model declaration itself, because when I inspect model.summary() I see the following (with all of the intermediate layers omitted, of course):

Output of model.summary():

input_1 (InputLayer)            [(None, 299, 299, 3) 0       
__________________________________________________________________________________________________
dropout (Dropout)               (None, 8, 8, 192)    0           batch_normalization_195[0][0]    
__________________________________________________________________________________________________               
dense (Dense)                   (None, 8, 8, 7)      1351        dropout[0][0]                    
==================================================================================================
Total params: 47,465,959
Trainable params: 47,411,559
Non-trainable params: 54,400

I have included a pastebin of the whole file in case I have missed something: https://pastebin.com/raw/E0VQ83JQ

I understand that the loss expects targets of shape (None, None) while my output comes from a Dense layer of shape (None, 8, 8, 7), but how do I reshape it?
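To make the mismatch concrete, a quick sanity check against the model defined above:

# The classifier head still carries the 8x8 spatial grid of the truncated backbone,
# so the model output is 4-D while the one-hot labels from the generator are 2-D.
print(model.output_shape)  # (None, 8, 8, 7)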

Any help is appreciated, including any documentation you think would be useful for this issue.

Answers [ 2 ]

2 votes
/ 09 May 2020

There needs to be a Flatten layer between the ResNet output and the Dense layer.

# Create Inception Res Net model as used in paper

resnet = tf.keras.applications.inception_resnet_v2.InceptionResNetV2()

print("Layers of ResNet: "+str(len(resnet.layers))) //782 layers

x = resnet.layers[-28].output

x = tf.keras.layers.Dropout(0.25)(x)

### Edit here.
x = tf.keras.layers.Flatten()(x)
# Make a prediction layer with 7 nodes for the 7 dir in our train_dir.
predictions_layer = tf.keras.layers.Dense(7, activation='softmax')(x)

# print(resnet.input)

# inputs=resnet.input selects the input layer, outputs=predictions refers to the
# dense layer we created above.

model = tf.keras.Model(inputs=resnet.input, outputs=predictions_layer)

Also make sure that the train_batches argument is valid.
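If you would rather not flatten the full 8 × 8 × 192 feature map, a GlobalAveragePooling2D layer is a common alternative when truncating a convolutional backbone. A minimal sketch under the same assumptions as the code above:

# Average over the 8x8 spatial grid instead of flattening it, so the Dense layer
# only needs 192 * 7 + 7 weights rather than 8 * 8 * 192 * 7 + 7.
x = resnet.layers[-28].output
x = tf.keras.layers.GlobalAveragePooling2D()(x)  # (None, 8, 8, 192) -> (None, 192)
x = tf.keras.layers.Dropout(0.25)(x)
predictions_layer = tf.keras.layers.Dense(7, activation='softmax')(x)

model = tf.keras.Model(inputs=resnet.input, outputs=predictions_layer)
print(model.output_shape)  # (None, 7), which matches one-hot labels for 7 classes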

0 votes
/ 09 May 2020

First convert X and y to NumPy arrays, then standardize X_train and X_test with StandardScaler, which rescales large values into a small range:

from sklearn.preprocessing import StandardScaler

# X and y are assumed to be pandas objects here; .values turns them into NumPy arrays.
X = X.values
y = y.values

scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)  # fit the scaler on the training split only
X_test = scaler.transform(X_test)        # reuse the training statistics for the test split

Now call model.fit.
