I am playing with TensorFlow 2. I made my own model, similar to how it is done here.
Then I wrote my own fit function. Now I am seeing the strangest thing. Here is an exact copy/paste of the output from my notebook where I ran the tests:
def fit(x_train, y_train, learning_rate=0.01, epochs=10, batch_size=100, normal=True, verbose=True, display_freq=100):
    if normal:
        x_train = normalize(x_train)  # TODO: This normalize could be a bit different for each and be bad.
    num_tr_iter = int(len(y_train) / batch_size)  # Number of training iterations in each epoch
    if verbose:
        print("Starting training...")
    for epoch in range(epochs):
        # Randomly shuffle the training data at the beginning of each epoch
        x_train, y_train = randomize(x_train, y_train)
        for iteration in range(num_tr_iter):
            # Get the batch
            start = iteration * batch_size
            end = (iteration + 1) * batch_size
            x_batch, y_batch = get_next_batch(x_train, y_train, start, end)
            # Run optimization op (backpropagation)
            # import pdb; pdb.set_trace()
            if verbose and (epoch * batch_size + iteration) % display_freq == 0:
                current_loss = _apply_loss(y_train, model(x_train, training=True))
                current_acc = evaluate_accuracy(x_train, y_train)
                print("Epoch: {0}/{1}; batch {2}/{3}; loss: {4:.4f}; accuracy: {5:.2f} %"
                      .format(epoch, epochs, iteration, num_tr_iter, current_loss, current_acc * 100))
            train_step(x_batch, y_batch, learning_rate)
    current_loss = _apply_loss(y_train, model(x_train, training=True))
    current_acc = evaluate_accuracy(x_train, y_train)
    print("End: loss: {0:.4f}; accuracy: {1:.2f} %".format(current_loss, current_acc * 100))

import logging
logging.getLogger('tensorflow').disabled = True

fit(x_train, y_train)
current_loss = _apply_loss(y_train, model(x_train, training=True))
current_acc = evaluate_accuracy(x_train, y_train)
print("End: loss: {0:.4f}; accuracy: {1:.2f} %".format(current_loss, current_acc * 100))
This segment outputs:
Starting training...
Epoch: 0/10; batch 0/80; loss: 0.9533; accuracy: 59.67 %
Epoch: 1/10; batch 0/80; loss: 0.9386; accuracy: 60.15 %
Epoch: 2/10; batch 0/80; loss: 0.9259; accuracy: 60.50 %
Epoch: 3/10; batch 0/80; loss: 0.9148; accuracy: 61.05 %
Epoch: 4/10; batch 0/80; loss: 0.9051; accuracy: 61.15 %
Epoch: 5/10; batch 0/80; loss: 0.8968; accuracy: 61.35 %
Epoch: 6/10; batch 0/80; loss: 0.8896; accuracy: 61.27 %
Epoch: 7/10; batch 0/80; loss: 0.8833; accuracy: 61.51 %
Epoch: 8/10; batch 0/80; loss: 0.8780; accuracy: 61.52 %
Epoch: 9/10; batch 0/80; loss: 0.8733; accuracy: 61.54 %
End: loss: 0.8733; accuracy: 61.54 %
End: loss: 0.4671; accuracy: 77.08 %
Now my question is: how can I get a different value on the last 2 lines?! I am doing the exact same thing, right? I am completely puzzled here. I do not even know how to Google this.
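One difference I can point out (just a suspicion, not a confirmed cause): inside fit, the line x_train = normalize(x_train) rebinds only the local parameter, so the final print inside fit evaluates the model on the normalized (and reshuffled) array, while the print after fit evaluates it on the original, untouched x_train. The two evaluations therefore run on different data and need not agree. A minimal self-contained sketch of that rebinding behaviour, with hypothetical toy stand-ins for normalize and the loss:

```python
import numpy as np

def normalize(x):
    # Toy stand-in for the real normalize: zero mean, unit variance.
    return (x - x.mean()) / x.std()

def loss(x):
    # Toy stand-in for the real loss: mean of the squared inputs.
    return float(np.mean(x ** 2))

def fit(x_train):
    x_train = normalize(x_train)  # Rebinds the LOCAL name only; the caller's array is untouched.
    print("inside fit: loss = {0:.4f}".format(loss(x_train)))

x_train = np.array([10.0, 20.0, 30.0, 40.0])
fit(x_train)                                               # prints 1.0000 (normalized data)
print("after fit: loss = {0:.4f}".format(loss(x_train)))   # prints 750.0000 (raw data)
```

The same computation, run inside and right after the function, gives different numbers purely because of which array it sees.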