Please have a look at this example from the custom layers section of the TensorFlow tutorials; it shows how to use a block, i.e. how to pass the output of one layer into other layers.
Here a class is created that is nothing but a ResNet model block, and at the end of it you can see the identity (shortcut) connection.
Models: Composing layers
Many interesting layer-like things in machine learning models are implemented by composing existing layers. For example, each residual block in a resnet is a composition of convolutions, batch normalizations, and a shortcut. Layers can be nested inside other layers.
Typically you inherit from keras.Model when you need the model methods such as Model.fit, Model.evaluate, and Model.save (see Custom Keras layers and models for details).
One other feature provided by keras.Model (instead of keras.layers.Layer) is that in addition to tracking variables, a keras.Model also tracks its internal layers, making them easier to inspect.
import tensorflow as tf

class ResnetIdentityBlock(tf.keras.Model):
    def __init__(self, kernel_size, filters):
        super(ResnetIdentityBlock, self).__init__(name='')
        filters1, filters2, filters3 = filters

        self.conv2a = tf.keras.layers.Conv2D(filters1, (1, 1))
        self.bn2a = tf.keras.layers.BatchNormalization()

        self.conv2b = tf.keras.layers.Conv2D(filters2, kernel_size, padding='same')
        self.bn2b = tf.keras.layers.BatchNormalization()

        self.conv2c = tf.keras.layers.Conv2D(filters3, (1, 1))
        self.bn2c = tf.keras.layers.BatchNormalization()

    def call(self, input_tensor, training=False):
        x = self.conv2a(input_tensor)
        x = self.bn2a(x, training=training)
        x = tf.nn.relu(x)

        x = self.conv2b(x)
        x = self.bn2b(x, training=training)
        x = tf.nn.relu(x)

        x = self.conv2c(x)
        x = self.bn2c(x, training=training)

        x += input_tensor          # the identity (shortcut) connection
        return tf.nn.relu(x)
Create an instance of the ResNet identity block:
block = ResnetIdentityBlock(1, [1, 2, 3])
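Because the class inherits from tf.keras.Model, the block tracks its internal layers and variables, so you can inspect them right away. A quick sketch (the input shape below is arbitrary; only the 3 channels matter, so that the shortcut addition lines up):

_ = block(tf.zeros([1, 2, 3, 3]))               # call it once so the block gets built
print([layer.name for layer in block.layers])   # the six tracked sub-layers
print(len(block.variables))                     # 18: conv kernels/biases + batch-norm stats
block.summary()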
def chain_blocks(input_tensor):
    # Note: reusing the same `block` instance shares its weights between calls;
    # create separate ResnetIdentityBlock instances if each block needs its own weights.
    x1 = block(input_tensor)
    x2 = block(x1)
    # ... chain as many blocks as needed ...
    return x2
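Calling it on a dummy input shows that the chain preserves the shape (a sketch; the spatial size is arbitrary, but the channel count must equal filters3 so the shortcut addition works):

out = chain_blocks(tf.zeros([1, 8, 8, 3]))
print(out.shape)   # (1, 8, 8, 3): each identity block preserves the input shape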
In this way you can chain the ResNet blocks sequentially inside a function. Moreover, if you want to add another layer after a block you can do that; just make sure the output shape of the block matches the input shape of the next layer.
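A minimal sketch of that (the extra Conv2D layer and its 16 filters are arbitrary choices, not part of the original code): since the identity block returns a tensor with the same shape as its input, any layer that accepts a 3-channel feature map can follow it.

next_layer = tf.keras.layers.Conv2D(16, (3, 3), padding='same')

def block_then_layer(input_tensor):
    x = block(input_tensor)        # the identity block preserves the input shape
    return next_layer(x)           # shapes are compatible, so this works

out = block_then_layer(tf.zeros([1, 8, 8, 3]))
print(out.shape)   # (1, 8, 8, 16)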
Let me know if you need more information.