I saw two quick-start templates in the TensorFlow 2.0.0-alpha tutorial: a beginner one that trains with model.fit, and an expert one that uses `with tf.GradientTape() as tape:` together with an optimizer.
I tried both on the same data and the same network, but only the beginner version trains well.
Input: 240 images of size [224, 224]
Network: VGG19 from keras.applications
Number of classes: 24
Model:
vgg = VGG19(input_shape=(224, 224, 3), include_top=False, pooling='avg', weights='imagenet')
vgg.trainable = True
for layer in vgg.layers[:17]:
    layer.trainable = False
model = models.Sequential([vgg, Dense(24, activation='softmax')])
Beginner code:
model.compile(optimizer=tf.keras.optimizers.RMSprop(lr=base_learning_rate), loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(x, y, epochs=total_epochs, batch_size=4)
Expert code:
loss_object = tf.keras.losses.CategoricalCrossentropy()
optimizer = tf.keras.optimizers.RMSprop(base_learning_rate)
train_loss = tf.keras.metrics.Mean(name='train_loss')
train_accuracy = tf.keras.metrics.CategoricalAccuracy(name='train_accuracy')
@tf.function
def train_step(images, labels):
    with tf.GradientTape() as tape:
        predictions = model(images)
        loss = loss_object(labels, predictions)
    gradients = tape.gradient(loss, model.trainable_variables)
    # gradients = [tf.clip_by_norm(g, 5) for g in gradients]
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    train_loss(loss)
    train_accuracy(labels, predictions)
x, y = read_data('Images')
dataset = tf.data.Dataset.from_tensor_slices((x, y))
dataset = dataset.map(cast).repeat(total_epochs).shuffle(300).batch(4)
for epoch in range(total_epochs):
    for image, label in dataset:
        train_step(image, label)

    template = 'Epoch {}, Loss: {}, Accuracy: {}'
    print(template.format(epoch + 1,
                          train_loss.result(),
                          train_accuracy.result() * 100))
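One thing I noticed while writing this up (just a guess about what might differ, not a confirmed diagnosis): the pipeline already calls .repeat(total_epochs), and the outer for-epoch loop then iterates that repeated dataset total_epochs more times, so the data is actually traversed total_epochs ** 2 times. A pure-Python sketch of the iteration count, with a toy list standing in for the tf.data pipeline:

```python
# Pure-Python stand-in for the tf.data pipeline above: .repeat(total_epochs)
# already yields the data total_epochs times in a single pass over the
# dataset, and the outer epoch loop multiplies that again.
total_epochs = 3
data = [0, 1]  # toy stand-in for the (image, label) pairs

repeated = data * total_epochs  # one full pass over dataset.repeat(total_epochs)
steps = 0
for epoch in range(total_epochs):
    for _ in repeated:
        steps += 1

print(steps)  # 18 == len(data) * total_epochs ** 2, not len(data) * total_epochs
```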
Beginner results:
1/60 [..............................] - ETA: 14s - loss: 0.0631 - accuracy: 1.0000
2/60 [>.............................] - ETA: 14s - loss: 0.0440 - accuracy: 1.0000
3/60 [>.............................] - ETA: 13s - loss: 0.0340 - accuracy: 1.0000
4/60 [=>............................] - ETA: 13s - loss: 0.0274 - accuracy: 1.0000
5/60 [=>............................] - ETA: 13s - loss: 0.0307 - accuracy: 1.0000
6/60 [==>...........................] - ETA: 13s - loss: 0.0279 - accuracy: 1.0000
7/60 [==>...........................] - ETA: 13s - loss: 0.0261 - accuracy: 1.0000
8/60 [===>..........................] - ETA: 12s - loss: 0.0315 - accuracy: 1.0000
9/60 [===>..........................] - ETA: 12s - loss: 0.0293 - accuracy: 1.0000
10/60 [====>.........................] - ETA: 12s - loss: 0.0272 - accuracy: 1.0000
11/60 [====>.........................] - ETA: 12s - loss: 0.0300 - accuracy: 1.0000
12/60 [=====>........................] - ETA: 11s - loss: 0.0446 - accuracy: 1.0000
13/60 [=====>........................] - ETA: 11s - loss: 0.0416 - accuracy: 1.0000
14/60 [======>.......................] - ETA: 11s - loss: 0.0440 - accuracy: 1.0000
15/60 [======>.......................] - ETA: 11s - loss: 0.0439 - accuracy: 1.0000
Expert results:
Epoch 1, Loss: 4.321621, Accuracy: 3.225641
Epoch 2, Loss: 3.671219, Accuracy: 4.160391
Epoch 3, Loss: 3.451912, Accuracy: 3.561214
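Another detail I'm unsure about: train_loss and train_accuracy are tf.keras.metrics objects, which accumulate a running total until reset_states() is called, and my loop never resets them, so each printed value averages over all epochs so far rather than just the current one. A minimal pure-Python sketch of that running-mean behaviour (the Mean class here is a hypothetical stand-in for tf.keras.metrics.Mean, not the real implementation):

```python
# Minimal stand-in for tf.keras.metrics.Mean: it keeps a running
# total/count and reports the average since the last reset.
class Mean:
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def __call__(self, value):
        self.total += value
        self.count += 1

    def result(self):
        return self.total / self.count

    def reset_states(self):
        self.total, self.count = 0.0, 0

m = Mean()
for loss in [4.0, 4.0]:   # "epoch 1" batch losses
    m(loss)
epoch1 = m.result()       # 4.0
for loss in [2.0, 2.0]:   # "epoch 2" batch losses; metric never reset
    m(loss)
epoch2 = m.result()       # 3.0 -- epoch 1 still drags the reading up
print(epoch1, epoch2)
```

Could either of these explain why the expert-style loop ends up with much worse loss and accuracy than model.fit on the same model and data?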