No batch_size when running inference with the BERT model - PullRequest
5 votes
/ 02 July 2019

I am working on a binary classification problem with the TensorFlow BERT language model. Here is a link to the Google Colab. After saving and loading the model, I get an error when running prediction.

Saving the model

def serving_input_receiver_fn():
  # The feature spec mirrors the features produced by
  # run_classifier.convert_examples_to_features.
  feature_spec = {
      "input_ids" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "input_mask" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "segment_ids" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "label_ids" :  tf.FixedLenFeature([], tf.int64)
  }
  # The exported model expects serialized tf.train.Example protos
  # under the receiver key 'example'.
  serialized_tf_example = tf.placeholder(dtype=tf.string,
                                         shape=[None],
                                         name='input_example_tensor')
  print(serialized_tf_example.shape)
  receiver_tensors = {'example': serialized_tf_example}
  features = tf.parse_example(serialized_tf_example, feature_spec)
  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

export_path = '/content/drive/My Drive/binary_class/bert/'
estimator._export_to_tpu = False  # this is important
estimator.export_saved_model(export_dir_base=export_path,
                             serving_input_receiver_fn=serving_input_receiver_fn)
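For reference, the receiver above expects serialized tf.train.Example protos under the key 'example', so the exported SavedModel can also be queried without going through an Estimator at all. The sketch below is my own illustration (it is not part of the Colab) and assumes TF 1.x plus a hypothetical saved_model_dir pointing at the timestamped directory produced by export_saved_model:

# Sketch only: query the exported SavedModel via tf.contrib.predictor.
# saved_model_dir is a placeholder for the timestamped export directory.
import tensorflow as tf

def serialize_example(input_ids, input_mask, segment_ids, label_id=0):
  # Encode one example in the same layout as feature_spec above.
  def int64_list(values):
    return tf.train.Feature(int64_list=tf.train.Int64List(value=list(values)))
  example = tf.train.Example(features=tf.train.Features(feature={
      "input_ids": int64_list(input_ids),
      "input_mask": int64_list(input_mask),
      "segment_ids": int64_list(segment_ids),
      "label_ids": int64_list([label_id]),
  }))
  return example.SerializeToString()

predict_fn = tf.contrib.predictor.from_saved_model(saved_model_dir)
# result = predict_fn({"example": [serialize_example(ids, mask, segs)]})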

Prediction on dummy text

pred_sentences = [
  "A novel, simple method to get insights from reviews"
]

def getPrediction1(in_sentences):
  labels = ["Irrelevant", "Relevant"]
  input_examples = [run_classifier.InputExample(guid="", text_a = x, text_b = None, label = 0) for x in in_sentences]  # "" is a dummy guid and 0 a dummy label
  input_features = run_classifier.convert_examples_to_features(input_examples, label_list, MAX_SEQ_LENGTH, tokenizer)
  predict_input_fn = run_classifier.input_fn_builder(features=input_features, seq_length=MAX_SEQ_LENGTH, is_training=False, drop_remainder=False)
  predictions = est.predict(predict_input_fn)
  print(predictions)
  return [(sentence, prediction['probabilities'], labels[prediction['labels']]) for sentence, prediction in zip(in_sentences, predictions)]

est = tf.contrib.estimator.SavedModelEstimator(MODEL_FILE_PATH)
predictions = getPrediction1(pred_sentences)
predictions

Error

W0702 05:44:17.551325 139812812932992 estimator.py:1811] Using temporary folder as model directory: /tmp/tmpzeiaa6q8
W0702 05:44:17.605536 139812812932992 saved_model_estimator.py:170] train mode not found in SavedModel.
W0702 05:44:17.608479 139812812932992 saved_model_estimator.py:170] eval mode not found in SavedModel.
<generator object Estimator.predict at 0x7f27fa721eb8>
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-28-56ea95428bf4> in <module>()
     21 # Relevant "Nanoparticulate drug delivery is a promising drug delivery system to a range of molecules to desired site specific action in the body. In this present work nanoparticles are prepared with positive group of amino group of chitosan with varying concentration based nanoparticles are loaded with anastrazole were prepared by with negative group of sodium tripolyphosphate by ionotropic gelation method. All these formulated nanoparticles are characterized for its particle size ,zeta potential ,drug entrapment efficacy and in-vitro release kinetics .The particle size of all these formulations were found to be 200,365,420,428 And 483.zeta potential of all formulations are-16.3±2.1 ,28.2±4.3,-10.38±3.6,-24.31±3.2 and 21.38±5.2.respectively. FT-IR studies indicated that there was no chemical interaction between drug and polymer and stability of drug. The in-vitro release behaviour from all the drug loaded batches was found to be zero order and provided sustained release over a period of 12 h by diffusion and swelling mechanism and The values of n and r 2 for coated batch was 0.731 and 0.979.Since the values of slope (n) lies in between 0.5 and 1 it was concluded that the mechanism by which drug is being released is a Non-Fickian anomalous solute diffusion mechanism, "
     22 
---> 23 predictions = getPrediction1(pred_sentences[0:2])
     24 predictions
     25 

5 frames
<ipython-input-28-56ea95428bf4> in getPrediction1(in_sentences)
     14   predictions = est.predict(predict_input_fn)
     15   print(predictions)
---> 16   return [(sentence, prediction['probabilities'], labels[prediction['labels']]) for sentence, prediction in zip(in_sentences, predictions)]
     17 
     18 

<ipython-input-28-56ea95428bf4> in <listcomp>(.0)
     14   predictions = est.predict(predict_input_fn)
     15   print(predictions)
---> 16   return [(sentence, prediction['probabilities'], labels[prediction['labels']]) for sentence, prediction in zip(in_sentences, predictions)]
     17 
     18 

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in predict(self, input_fn, predict_keys, hooks, checkpoint_path, yield_single_examples)
    615         self._create_and_assert_global_step(g)
    616         features, input_hooks = self._get_features_from_input_fn(
--> 617             input_fn, ModeKeys.PREDICT)
    618         estimator_spec = self._call_model_fn(
    619             features, None, ModeKeys.PREDICT, self.config)

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in _get_features_from_input_fn(self, input_fn, mode)
    991   def _get_features_from_input_fn(self, input_fn, mode):
    992     """Extracts the `features` from return values of `input_fn`."""
--> 993     result = self._call_input_fn(input_fn, mode)
    994     result, _, hooks = estimator_util.parse_input_fn_result(result)
    995     self._validate_features_in_predict_input(result)

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in _call_input_fn(self, input_fn, mode, input_context)
   1111       kwargs['input_context'] = input_context
   1112     with ops.device('/cpu:0'):
-> 1113       return input_fn(**kwargs)
   1114 
   1115   def _call_model_fn(self, features, labels, mode, config):

/usr/local/lib/python3.6/dist-packages/bert/run_classifier.py in input_fn(params)
    727   def input_fn(params):
    728     """The actual input function."""
--> 729     batch_size = params["batch_size"]
    730 
    731     num_examples = len(features)

KeyError: 'batch_size'

The batch_size parameter is present in the original estimator but missing from the params of the loaded model:

estimator.params['batch_size'] # 32

est.params['batch_size'] # KeyError: 'batch_size'

1 Answer

4 votes
/ 08 July 2019

You are using SavedModelEstimator, which does not allow RunConfig or params arguments to be passed in, because the model function graph is defined statically in the SavedModel.

Since SavedModelEstimator is a subclass of Estimator, params is simply a dictionary that stores the hyperparameters. I think you can modify params by adding the (key, value) pair it needs before calling getPrediction1. For example:

est = tf.contrib.estimator.SavedModelEstimator(MODEL_FILE_PATH)
est.params['batch_size'] = 1
predictions = getPrediction1(pred_sentences)
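
An alternative sketch (my own workaround, not something documented in the BERT repo): since the input_fn returned by run_classifier.input_fn_builder only reads params["batch_size"], you could hand it the batch size explicitly and bypass est.params entirely:

# predict_input_fn is the function built inside getPrediction1 by
# run_classifier.input_fn_builder. A zero-argument lambda stops the Estimator
# from injecting its own params, and we supply batch_size ourselves.
predictions = est.predict(lambda: predict_input_fn(params={"batch_size": 1}))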