How to fix "expected dense_2 to have 2 dimensions"
0 votes
/ June 25, 2019

I am trying to write code for MNIST, a dataset that contains the pixel information of image data.

I am running into a dimension problem in this code. I am new to Python and don't know how to fix it.

from keras.datasets import mnist
(train_images,train_labels),(test_images,test_labels)=mnist.load_data()

from keras import models
from keras import layers

network=models.Sequential()
network.add(layers.Dense(512,activation='relu',input_shape=(28*28,)))
network.add(layers.Dense(10,activation='softmax'))

network.compile(optimizer='rmsprop',
           loss='categorical_crossentropy',
           metrics=['accuracy'])

train_images=train_images.reshape((60000,28*28))
train_images=train_images.astype('float32')/255

test_images=test_images.reshape((10000,28*28))
test_images=test_images.astype('float32')/255

from keras.utils import to_categorical

train_labels=to_categorical(train_labels)
test_labels=to_categorical(train_labels)

network.fit(train_images, train_labels, epochs=10,batch_size=128)

test_loss,test_acc=network.evaluate(test_images,test_labels)

The error I get:

ValueError: Error when checking target: expected dense_2 to have 2 dimensions, but got array with shape (60000, 10, 2)

How can I solve this? I can't figure out a solution. Please help me.

Answers [ 3 ]

0 votes
/ June 25, 2019

The arguments passed to to_categorical() are the reason you get the error above: the second call re-encodes train_labels, which is already one-hot encoded, instead of encoding test_labels. So try changing

train_labels=to_categorical(train_labels)
test_labels=to_categorical(train_labels)

to

train_labels=to_categorical(train_labels, 10)
test_labels=to_categorical(test_labels, 10)
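To see why the original version produces the (60000, 10, 2) shape, here is a minimal sketch (my own illustration, not part of the question's code): to_categorical() applied to an already one-hot array treats the 0/1 entries as two classes and adds one more axis.

import numpy as np
from keras.utils import to_categorical

labels = np.array([0, 1, 9])        # raw integer labels, shape (3,)
once = to_categorical(labels, 10)   # one-hot encoding, shape (3, 10)
twice = to_categorical(once)        # re-encodes the 0/1 values as 2 classes

print(once.shape)    # (3, 10)    -- a valid 2-D target for Dense(10)
print(twice.shape)   # (3, 10, 2) -- the 3-D target the error complains about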
0 votes
/ June 25, 2019

Try it this way, without reshaping test_images. I am renaming the variables just for convention.

    from keras import models
    from keras import layers
    from keras.datasets import mnist
    from keras.utils import to_categorical

    (x_train, y_train), (x_test, y_test) = mnist.load_data()

    x_train = x_train.reshape((60000, 28 * 28))
    x_train = x_train.astype('float32') / 255
    y_train = to_categorical(y_train)

    # model
    network = models.Sequential()
    network.add(layers.Dense(512, activation='relu', input_shape=(28 * 28,)))
    network.add(layers.Dense(10, activation='softmax'))

    network.compile(optimizer='rmsprop',
                    loss='categorical_crossentropy',
                    metrics=['accuracy'])

    print(x_train.ndim)    # 2
    print(x_train.shape)   # (60000, 784)

    print(x_test.ndim)     # 3
    print(x_test.shape)    # (10000, 28, 28)

    print(y_train.ndim)    # 2
    print(y_train.shape)   # (60000, 10)

    print(y_test.ndim)     # 1
    print(y_test.shape)    # (10000,)

    network.fit(x_train, y_train, epochs=10, batch_size=128)

Output

      128/60000 [..............................] - ETA: 2:39 - loss: 2.3697 - acc: 0.1406
     1152/60000 [..............................] - ETA: 20s - loss: 1.2849 - acc: 0.6285 
     2560/60000 [>.............................] - ETA: 10s - loss: 0.9101 - acc: 0.7441
     3968/60000 [>.............................] - ETA: 7s - loss: 0.7705 - acc: 0.7815 
     5248/60000 [=>............................] - ETA: 5s - loss: 0.6864 - acc: 0.8043
     6528/60000 [==>...........................] - ETA: 4s - loss: 0.6268 - acc: 0.8202
     7808/60000 [==>...........................] - ETA: 4s - loss: 0.5903 - acc: 0.8295
     9216/60000 [===>..........................] - ETA: 3s - loss: 0.5513 - acc: 0.8409
    10496/60000 [====>.........................] - ETA: 3s - loss: 0.5221 - acc: 0.8491
    11904/60000 [====>.........................] - ETA: 3s - loss: 0.4945 - acc: 0.8576
    13312/60000 [=====>........................] - ETA: 3s - loss: 0.4764 - acc: 0.8629
    14592/60000 [======>.......................] - ETA: 2s - loss: 0.4584 - acc: 0.8682
    16000/60000 [=======>......................] - ETA: 2s - loss: 0.4428 - acc: 0.8724
    17408/60000 [=======>......................] - ETA: 2s - loss: 0.4298 - acc: 0.8758
    18816/60000 [========>.....................] - ETA: 2s - loss: 0.4181 - acc: 0.8792
    20224/60000 [=========>....................] - ETA: 2s - loss: 0.4058 - acc: 0.8828
    21120/60000 [=========>....................] - ETA: 2s - loss: 0.3996 - acc: 0.8847
    21888/60000 [=========>....................] - ETA: 2s - loss: 0.3934 - acc: 0.8865
    22784/60000 [==========>...................] - ETA: 2s - loss: 0.3856 - acc: 0.8889
    23808/60000 [==========>...................] - ETA: 2s - loss: 0.3799 - acc: 0.8907
    24960/60000 [===========>..................] - ETA: 1s - loss: 0.3734 - acc: 0.8925
    26368/60000 [============>.................] - ETA: 1s - loss: 0.3649 - acc: 0.8951
    27776/60000 [============>.................] - ETA: 1s - loss: 0.3577 - acc: 0.8968
    29184/60000 [=============>................] - ETA: 1s - loss: 0.3513 - acc: 0.8990
    30464/60000 [==============>...............] - ETA: 1s - loss: 0.3461 - acc: 0.9007
    31872/60000 [==============>...............] - ETA: 1s - loss: 0.3391 - acc: 0.9023
    33280/60000 [===============>..............] - ETA: 1s - loss: 0.3336 - acc: 0.9037
    34688/60000 [================>.............] - ETA: 1s - loss: 0.3280 - acc: 0.9051
    35968/60000 [================>.............] - ETA: 1s - loss: 0.3231 - acc: 0.9065
    37248/60000 [=================>............] - ETA: 1s - loss: 0.3188 - acc: 0.9078
    38528/60000 [==================>...........] - ETA: 1s - loss: 0.3131 - acc: 0.9095
    39936/60000 [==================>...........] - ETA: 1s - loss: 0.3081 - acc: 0.9109
    41216/60000 [===================>..........] - ETA: 0s - loss: 0.3034 - acc: 0.9123
    42496/60000 [====================>.........] - ETA: 0s - loss: 0.2993 - acc: 0.9134
    43648/60000 [====================>.........] - ETA: 0s - loss: 0.2960 - acc: 0.9145
    44544/60000 [=====================>........] - ETA: 0s - loss: 0.2929 - acc: 0.9154
    45312/60000 [=====================>........] - ETA: 0s - loss: 0.2900 - acc: 0.9162
    46208/60000 [======================>.......] - ETA: 0s - loss: 0.2872 - acc: 0.9170
    46976/60000 [======================>.......] - ETA: 0s - loss: 0.2859 - acc: 0.9174
    48000/60000 [=======================>......] - ETA: 0s - loss: 0.2831 - acc: 0.9180
    49280/60000 [=======================>......] - ETA: 0s - loss: 0.2800 - acc: 0.9190
    50560/60000 [========================>.....] - ETA: 0s - loss: 0.2768 - acc: 0.9197
    51840/60000 [========================>.....] - ETA: 0s - loss: 0.2749 - acc: 0.9203
    53120/60000 [=========================>....] - ETA: 0s - loss: 0.2719 - acc: 0.9211
    54400/60000 [==========================>...] - ETA: 0s - loss: 0.2692 - acc: 0.9219
    55808/60000 [==========================>...] - ETA: 0s - loss: 0.2661 - acc: 0.9227
    57216/60000 [===========================>..] - ETA: 0s - loss: 0.2634 - acc: 0.9236
    58496/60000 [============================>.] - ETA: 0s - loss: 0.2607 - acc: 0.9244
    59904/60000 [============================>.] - ETA: 0s - loss: 0.2579 - acc: 0.9253
    60000/60000 [==============================] - 3s 48us/step - loss: 0.2576 - acc: 0.9254
    Epoch 2/10

...

    Epoch 10/10
      128/60000 [..............................] - ETA: 2s - loss: 0.0089 - acc: 0.9922
     1280/60000 [..............................] - ETA: 2s - loss: 0.0095 - acc: 0.9961
     2560/60000 [>.............................] - ETA: 2s - loss: 0.0071 - acc: 0.9977
     3840/60000 [>.............................] - ETA: 2s - loss: 0.0079 - acc: 0.9977
     4992/60000 [=>............................] - ETA: 2s - loss: 0.0077 - acc: 0.9976
     6272/60000 [==>...........................] - ETA: 2s - loss: 0.0073 - acc: 0.9976
     7552/60000 [==>...........................] - ETA: 2s - loss: 0.0074 - acc: 0.9975
     8448/60000 [===>..........................] - ETA: 2s - loss: 0.0073 - acc: 0.9974
     9728/60000 [===>..........................] - ETA: 2s - loss: 0.0079 - acc: 0.9972
    11008/60000 [====>.........................] - ETA: 2s - loss: 0.0088 - acc: 0.9970
    12160/60000 [=====>........................] - ETA: 2s - loss: 0.0090 - acc: 0.9970
    13440/60000 [=====>........................] - ETA: 2s - loss: 0.0093 - acc: 0.9969
    14720/60000 [======>.......................] - ETA: 1s - loss: 0.0093 - acc: 0.9971
    16128/60000 [=======>......................] - ETA: 1s - loss: 0.0093 - acc: 0.9972
    17024/60000 [=======>......................] - ETA: 1s - loss: 0.0093 - acc: 0.9972
    17664/60000 [=======>......................] - ETA: 1s - loss: 0.0092 - acc: 0.9973
    18560/60000 [========>.....................] - ETA: 1s - loss: 0.0102 - acc: 0.9972
    19328/60000 [========>.....................] - ETA: 1s - loss: 0.0102 - acc: 0.9971
    20096/60000 [=========>....................] - ETA: 1s - loss: 0.0102 - acc: 0.9971
    21504/60000 [=========>....................] - ETA: 1s - loss: 0.0100 - acc: 0.9972
    22784/60000 [==========>...................] - ETA: 1s - loss: 0.0096 - acc: 0.9973
    24192/60000 [===========>..................] - ETA: 1s - loss: 0.0094 - acc: 0.9974
    25344/60000 [===========>..................] - ETA: 1s - loss: 0.0093 - acc: 0.9974
    26624/60000 [============>.................] - ETA: 1s - loss: 0.0094 - acc: 0.9974
    27904/60000 [============>.................] - ETA: 1s - loss: 0.0095 - acc: 0.9974
    29312/60000 [=============>................] - ETA: 1s - loss: 0.0096 - acc: 0.9974
    30592/60000 [==============>...............] - ETA: 1s - loss: 0.0096 - acc: 0.9973
    31872/60000 [==============>...............] - ETA: 1s - loss: 0.0095 - acc: 0.9974
    33152/60000 [===============>..............] - ETA: 1s - loss: 0.0096 - acc: 0.9974
    34432/60000 [================>.............] - ETA: 1s - loss: 0.0095 - acc: 0.9974
    35840/60000 [================>.............] - ETA: 1s - loss: 0.0096 - acc: 0.9973
    36992/60000 [=================>............] - ETA: 1s - loss: 0.0095 - acc: 0.9974
    38272/60000 [==================>...........] - ETA: 0s - loss: 0.0095 - acc: 0.9974
    38784/60000 [==================>...........] - ETA: 0s - loss: 0.0094 - acc: 0.9974
    39680/60000 [==================>...........] - ETA: 0s - loss: 0.0094 - acc: 0.9973
    40448/60000 [===================>..........] - ETA: 0s - loss: 0.0095 - acc: 0.9973
    41216/60000 [===================>..........] - ETA: 0s - loss: 0.0095 - acc: 0.9973
    42240/60000 [====================>.........] - ETA: 0s - loss: 0.0095 - acc: 0.9973
    43520/60000 [====================>.........] - ETA: 0s - loss: 0.0096 - acc: 0.9973
    44800/60000 [=====================>........] - ETA: 0s - loss: 0.0095 - acc: 0.9973
    46080/60000 [======================>.......] - ETA: 0s - loss: 0.0094 - acc: 0.9973
    47360/60000 [======================>.......] - ETA: 0s - loss: 0.0096 - acc: 0.9972
    48384/60000 [=======================>......] - ETA: 0s - loss: 0.0097 - acc: 0.9971
    49664/60000 [=======================>......] - ETA: 0s - loss: 0.0098 - acc: 0.9971
    50944/60000 [========================>.....] - ETA: 0s - loss: 0.0097 - acc: 0.9971
    52096/60000 [=========================>....] - ETA: 0s - loss: 0.0097 - acc: 0.9971
    53504/60000 [=========================>....] - ETA: 0s - loss: 0.0098 - acc: 0.9971
    54784/60000 [==========================>...] - ETA: 0s - loss: 0.0099 - acc: 0.9971
    56064/60000 [===========================>..] - ETA: 0s - loss: 0.0099 - acc: 0.9971
    57472/60000 [===========================>..] - ETA: 0s - loss: 0.0100 - acc: 0.9970
    58752/60000 [============================>.] - ETA: 0s - loss: 0.0101 - acc: 0.9971
    59904/60000 [============================>.] - ETA: 0s - loss: 0.0100 - acc: 0.9971
    60000/60000 [==============================] - 3s 45us/step - loss: 0.0100 - acc: 0.9971
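If you also want the evaluation step from the question, the test data still needs the same preparation as the training data; a sketch continuing the variable names above (assuming the same flatten/scale/encode preprocessing as the training set):

    x_test = x_test.reshape((10000, 28 * 28))   # flatten to 2-D, like x_train
    x_test = x_test.astype('float32') / 255     # same scaling as the training set
    y_test = to_categorical(y_test)             # one-hot, shape (10000, 10)

    test_loss, test_acc = network.evaluate(x_test, y_test)
    print(test_acc)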
0 votes
/ June 25, 2019

Could you provide the complete error traceback? It clearly tells you that you have an array with 3 dimensions (it could be a black-and-white image). So one option is to take the first two 2-dimensional arrays (one array per channel).

But you should provide more information so we can see what is going on.

Good luck
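As a starting point, printing the shapes of the label arrays from the question would show where the extra dimension appears (a hypothetical debugging snippet, not from the original post):

print(train_labels.shape)   # should be (60000, 10) after one-hot encoding
print(test_labels.shape)    # (60000, 10, 2) here exposes the double encoding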

...