Keras multiclass classification input_shape size problem - PullRequest
0 votes
/ 16 June 2020

I'm building a face-recognition AI with Keras. I collected the data in the following structure:

dataset
 - a
 - b
 - c
 - d
main.py

Here a, b, c and d are the classes, each containing images. I want to use the image_dataset_from_directory command, but when I try to resize all the images to 400x400 I get a message that shapes 160000 and 400x400 are incompatible. If I instead reshape the images to (1, 160000) everything runs, but the accuracy is poor because the images are distorted. Any help is appreciated.
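For reference, here is a quick sketch (the same image_dataset_from_directory call as in the full code below) just to show the shapes the dataset yields; the printed spec is what I would expect for grayscale images with categorical labels:

import tensorflow as tf

# Same call as in the script below; only inspecting the element shapes it produces.
data = tf.keras.preprocessing.image_dataset_from_directory(
    "dataset", labels="inferred", label_mode="categorical",
    class_names=['a', 'b', 'c', 'd'], color_mode='grayscale',
    image_size=(400, 400))

# Each batch is an (images, labels) pair; grayscale images keep a channel axis of 1.
print(data.element_spec)
# -> (TensorSpec(shape=(None, 400, 400, 1), ...), TensorSpec(shape=(None, 4), ...))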

I'm using Python 3.7 with TensorFlow 2.3 and tf-nightly.

Here is the error:

WARNING:tensorflow:Model was constructed with shape (None, 400, 400) for input Tensor("flatten_input:0", shape=(None, 400, 400), dtype=float32), but it was called on an input with incompatible shape (400, 400, 1).
Traceback (most recent call last):
  File "C:\Users\\PycharmProjects\custom face\mainm.py", line 48, in <module>
    prediction = model.predict_on_batch(pred[1][0])
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\training.py", line 1766, in predict_on_batch
    outputs = predict_function(iterator)
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\eager\def_function.py", line 780, in __call__
    result = self._call(*args, **kwds)
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\eager\def_function.py", line 823, in _call
    self._initialize(args, kwds, add_initializers_to=initializers)
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\eager\def_function.py", line 697, in _initialize
    *args, **kwds))
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\eager\function.py", line 2841, in _get_concrete_function_internal_garbage_collected
    graph_function, _, _ = self._maybe_define_function(args, kwargs)
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\eager\function.py", line 3199, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\eager\function.py", line 3061, in _create_graph_function
    capture_by_value=self._capture_by_value),
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\framework\func_graph.py", line 979, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\eager\def_function.py", line 600, in wrapped_fn
    return weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\framework\func_graph.py", line 966, in wrapper
    raise e.ag_error_metadata.to_exception(e)
ValueError: in user code:

    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\training.py:1440 predict_function  *
        return step_function(self, iterator)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\training.py:1430 step_function  **
        outputs = model.distribute_strategy.run(run_step, args=(data,))
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:1063 run
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2377 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2736 _call_for_each_replica
        return fn(*args, **kwargs)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\training.py:1423 run_step  **
        outputs = model.predict_step(data)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\training.py:1396 predict_step
        return self(x, training=False)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\base_layer.py:961 __call__
        outputs = call_fn(inputs, *args, **kwargs)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\sequential.py:372 call
        return super(Sequential, self).call(inputs, training=training, mask=mask)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\functional.py:385 call
        inputs, training=training, mask=mask)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\functional.py:507 _run_internal_graph
        outputs = node.layer(*args, **kwargs)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\base_layer.py:952 __call__
        self.name)
    C:\Users\\PycharmProjects\custom face\venv\lib\site-packages\tensorflow\python\keras\engine\input_spec.py:216 assert_input_compatibility
        ' but received input with shape ' + str(shape))

    ValueError: Input 0 of layer dense is incompatible with the layer: expected axis -1 of input shape to have value 160000 but received input with shape [400, 400]

And here is my code:

from libraries import *

tf.compat.v1.enable_eager_execution()
gpu_options = tf.compat.v1.GPUOptions(per_process_gpu_memory_fraction=0.9)
sess = tf.compat.v1.Session(config=tf.compat.v1.ConfigProto(gpu_options=gpu_options))


data = tf.keras.preprocessing.image_dataset_from_directory(
    "dataset", labels="inferred", label_mode="categorical",
    class_names=['a', 'b', 'c', 'd'], color_mode='grayscale',
    image_size=(400, 400))

labels = ['k','am','ash','an']

model = keras.Sequential([ # Sequential lists the layers in order
    keras.layers.Flatten(input_shape=(400,400)), # input layer setup: turns data from [[1],[2],[3]] into [1,2,3]
    keras.layers.Dense(128, activation="relu"), # 1st hidden layer of 128 neurons; Dense means fully connected. The activation is the rectified linear
    # unit, which adds non-linearity by setting all negatives to 0 and passing positives through unchanged.
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(100, activation="relu"),
    keras.layers.Dense(4, activation="softmax") # softmax output layer of 4 neurons; the activation gives a probability for each of the 4 classes
])

# Compile up

model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"]) # the loss function measures how far the predicted

# probabilities are from the true labels; the optimizer adjusts weights and biases to reduce it. The metric is what you track during training.

#Train
checkpoint_path = "training_1/cp.ckpt"
checkpoint_dir = os.path.dirname(checkpoint_path)

# Create a callback that saves the model's weights
cp_callback = tf.keras.callbacks.ModelCheckpoint(filepath=checkpoint_path,
                                                 save_weights_only=True,
                                                 verbose=1)

#model.fit(data,epochs=1, callbacks = [cp_callback]) #epochs are how many times the network sees the same data during training. feeds data randomly. gives

model.load_weights("training_1/cp.ckpt")

test_loss, test_acc = model.evaluate(data)
print("Tested Loss: ", test_loss)
print("Tested Acc: ", test_acc*100, "%")

unbatched = data.unbatch()
print("================")
pred = list(unbatched.as_numpy_iterator())
print(len(pred))
prediction = model.predict_on_batch(pred[1][0])
# img = Image.fromarray(pred[0][0], 'RGB')
# img.save('my.png')
# img.show()
print(prediction)

1 Answer

1 vote
/ 16 June 2020

keras.layers.Flatten (...

You shouldn't start the model with this layer for image data like this: as declared, input_shape=(400, 400) doesn't match the (400, 400, 1) grayscale tensors the dataset yields. Use Conv2D layers first and then the dense layers. It would also help to look at more examples of working models; that will give you better results and fewer mistakes.

Example: https://keras.io/examples/mnist_cnn/
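For instance, here is a minimal CNN sketch that fits the (400, 400, 1) grayscale batches image_dataset_from_directory produces in the question; the filter counts and layer sizes are illustrative assumptions, not tuned values:

import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    # The input shape now matches what the dataset yields: 400x400 grayscale (1 channel).
    keras.layers.Conv2D(16, 3, activation="relu", input_shape=(400, 400, 1)),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),                        # flatten only after the convolutional layers
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(4, activation="softmax"),   # one output per class a/b/c/d
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

Note also that predict_on_batch expects a batch dimension, so a single image of shape (400, 400, 1) needs to be expanded to (1, 400, 400, 1), for example with np.expand_dims(img, 0), before it is passed to the model.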

...