I want to export a standard Estimator model for serving with TensorFlow Serving, but I see in the examples that we have to declare a serving_input_receiver_fn. The example is as follows:
feature_spec = {'foo': tf.FixedLenFeature(...),
                'bar': tf.VarLenFeature(...)}

def serving_input_receiver_fn():
    """An input receiver that expects a serialized tf.Example."""
    serialized_tf_example = tf.placeholder(dtype=tf.string,
                                           shape=[default_batch_size],
                                           name='input_example_tensor')
    receiver_tensors = {'examples': serialized_tf_example}
    features = tf.parse_example(serialized_tf_example, feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
I followed this example, and my code is the following:
feature_spec = {'user_id': tf.FixedLenFeature([1], tf.int64),
                'item_id': tf.FixedLenFeature([1], tf.int64)}

def serving_input_receiver_fn():
    serialized_tf_example = tf.placeholder(dtype=tf.string, shape=[None],
                                           name='input_tensors')
    receiver_tensors = {'inputs': serialized_tf_example}
    features = tf.parse_example(serialized_tf_example, feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
and my model looks like this:
def model_fn(features, labels, mode, params):
    user_id = features["user_id"]
    item_id = features["item_id"]
The error is:
InvalidArgumentError (see above for traceback): Restoring from checkpoint failed. This is most likely due to a mismatch between the current graph and the graph from the checkpoint. Please ensure that you have not altered the graph expected based on the checkpoint. Original error:
Assign requires shapes of both tensors to match. lhs shape= [64,128] rhs shape= [128,128]
[[node save/Assign_5 (defined at estimator.py:142) = Assign[T=DT_FLOAT, _class=["loc:@dense/kernel"], use_locking=true, validate_shape=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](dense/kernel, save/RestoreV2:5)]]
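One way to read this error (an assumption on my part, since only the error text is available): a dense layer's kernel variable has shape (input_dim, units), so if the serving graph feeds that layer a 64-wide input while the checkpoint was written from a training graph whose input was 128-wide, restoring dense/kernel fails with exactly this lhs/rhs mismatch. A dependency-free sketch of the arithmetic, with all dimensions taken from the error message:

```python
# A dense kernel's shape is (input_dim, units); restoring a checkpoint
# requires the shape built by the current graph (lhs) to equal the shape
# stored in the checkpoint (rhs).
units = 128

train_input_dim = 128    # input width when the checkpoint was written
serving_input_dim = 64   # input width built by the serving graph (assumed)

kernel_saved = (train_input_dim, units)      # rhs shape in the error: [128, 128]
kernel_serving = (serving_input_dim, units)  # lhs shape in the error: [64, 128]

# The restore fails because the two shapes differ:
print(kernel_serving, "!=", kernel_saved)
```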