How to combine embeddings with variable-length inputs in Keras?
0 votes
/ 09 October 2019

Here is a diagram of the network I am working on, with the data shown as a table alongside its structure:

[image: network diagram with the data table and its structure]

On the left we have abilities, which are continuous features, and on the right we can have any number 'N' of modifiers. Each modifier has a modifier_type, which is categorical, and some statistics, which are continuous features.

If there were only one modifier, here is code that works just fine:

import keras.backend as K
from keras.models import Model
from keras.layers import Input, Embedding, concatenate
from keras.layers import Dense, GlobalMaxPooling1D, Reshape
from keras.optimizers import Adam

K.clear_session()

# Using embeddings for categorical features
modifier_type_embedding_in=[]
modifier_type_embedding_out=[]

# sample categorical features
categorical_features = ['modifier_type']

modifier_input_ = Input(shape=(1,), name='modifier_type_in')
# Let's assume 10 unique type of modifiers and let's have embedding dimension as 6
modifier_output_ = Embedding(input_dim=10, output_dim=6, name='modifier_type')(modifier_input_)
modifier_output_ = Reshape(target_shape=(6,))(modifier_output_)  

modifier_type_embedding_in.append(modifier_input_)
modifier_type_embedding_out.append(modifier_output_)

# sample continuous features
statistics = ['duration']
statistics_inputs =[Input(shape=(len(statistics),), name='statistics')] # Input(shape=(1,))

# sample continuous features
abilities = ['buyback_cost', 'cooldown', 'number_of_deaths', 'ability', 'teleport', 'team', 'level', 'max_mana', 'intelligence']
abilities_inputs=[Input(shape=(len(abilities),), name='abilities')] # Input(shape=(9,))

concat = concatenate(modifier_type_embedding_out + statistics_inputs)
FC_relu = Dense(128, activation='relu', name='fc_relu_1')(concat)
FC_relu = Dense(128, activation='relu', name='fc_relu_2')(FC_relu)
model = concatenate(abilities_inputs + [FC_relu])
model = Dense(64, activation='relu', name='fc_relu_3')(model)
model_out = Dense(1, activation='sigmoid', name='fc_sigmoid')(model)

model_in = abilities_inputs + modifier_type_embedding_in + statistics_inputs
model = Model(inputs=model_in, outputs=model_out)
model.compile(loss='binary_crossentropy', optimizer=Adam(lr=2e-05, decay=1e-3), metrics=['accuracy'])
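To see why the `Reshape` is needed in this single-modifier version, here is a minimal NumPy sketch (the embedding matrix and batch values are made up for illustration): an `Embedding` applied to an input of shape `(batch, 1)` yields `(batch, 1, 6)`, which must be flattened to `(batch, 6)` before it can be concatenated with the 2-D `statistics` input.

```python
import numpy as np

batch = 4
embedding = np.random.rand(10, 6)        # 10 modifier types, embedding dim 6

modifier_type = np.random.randint(0, 10, size=(batch, 1))  # shape (batch, 1)
embedded = embedding[modifier_type]      # lookup -> shape (batch, 1, 6)
assert embedded.shape == (batch, 1, 6)

reshaped = embedded.reshape(batch, 6)    # what Reshape(target_shape=(6,)) does
statistics = np.random.rand(batch, 1)    # the 'duration' feature

concat = np.concatenate([reshaped, statistics], axis=-1)
print(concat.shape)  # (4, 7)
```

Without the `Reshape`, `concatenate` would be asked to merge a 3-D tensor with a 2-D one, which is exactly the rank mismatch behind the error further down.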

[image: model summary for the single-modifier network]

However, when I build the network for a variable number 'N' of modifiers, I get the error below. Here are the changes I made to the code:

modifier_input_ = Input(shape=(None, 1,), name='modifier_type_in')


statistics_inputs =[Input(shape=(None, len(statistics),), name='statistics')] # Input(shape=(None, 1,))


FC_relu = Dense(128, activation='relu', name='fc_relu_2')(FC_relu)
max_pool = GlobalMaxPooling1D()(FC_relu)

model = concatenate(abilities_inputs + [max_pool])

Here is what I get:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-3-7703088b1d24> in <module>
     22 abilities_inputs=[Input(shape=(len(abilities),), name='abilities')] # Input(shape=(9,))
     23 
---> 24 concat = concatenate(modifier_type_embedding_out + statistics_inputs)
     25 FC_relu = Dense(128, activation='relu', name='fc_relu_1')(concat)
     26 FC_relu = Dense(128, activation='relu', name='fc_relu_2')(FC_relu)

e:\Miniconda3\lib\site-packages\keras\layers\merge.py in concatenate(inputs, axis, **kwargs)
    647         A tensor, the concatenation of the inputs alongside axis `axis`.
    648     """
--> 649     return Concatenate(axis=axis, **kwargs)(inputs)
    650 
    651 

e:\Miniconda3\lib\site-packages\keras\engine\base_layer.py in __call__(self, inputs, **kwargs)
    423                                          'You can build it manually via: '
    424                                          '`layer.build(batch_input_shape)`')
--> 425                 self.build(unpack_singleton(input_shapes))
    426                 self.built = True
    427 

e:\Miniconda3\lib\site-packages\keras\layers\merge.py in build(self, input_shape)
    360                              'inputs with matching shapes '
    361                              'except for the concat axis. '
--> 362                              'Got inputs shapes: %s' % (input_shape))
    363 
    364     def _merge_function(self, inputs):

ValueError: A `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 6), (None, None, 1)]

How do I use an embedding layer in a neural network designed to accept variable-length input features?

1 Answer

0 votes
/ 18 October 2019

Answer: make the modifier input variable-length with `Input(shape=(None,))`, drop the `Reshape` (the `Embedding` output already has shape `(batch, N, 6)`), and apply `GlobalMaxPooling1D` after the shared dense layers to collapse the variable-length axis into a fixed-size vector before concatenating with the ability features:

import keras.backend as K
from keras.models import Model
from keras.layers import Input, Embedding, concatenate
from keras.layers import Dense, GlobalMaxPooling1D, Reshape
from keras.optimizers import Adam

K.clear_session()

# Using embeddings for categorical features
modifier_type_embedding_in=[]
modifier_type_embedding_out=[]

# sample categorical features
categorical_features = ['modifier_type']

modifier_input_ = Input(shape=(None,), name='modifier_type_in')
# Let's assume 10 unique type of modifiers and let's have embedding dimension as 6
modifier_output_ = Embedding(input_dim=10, output_dim=6, name='modifier_type')(modifier_input_)

modifier_type_embedding_in.append(modifier_input_)
modifier_type_embedding_out.append(modifier_output_)

# sample continuous features
statistics = ['duration']
statistics_inputs =[Input(shape=(None, len(statistics),), name='statistics')] # Input(shape=(None, 1))

# sample continuous features
abilities = ['buyback_cost', 'cooldown', 'number_of_deaths', 'ability', 'teleport', 'team', 'level', 'max_mana', 'intelligence']
abilities_inputs=[Input(shape=(len(abilities),), name='abilities')] # Input(shape=(9,))

concat = concatenate(modifier_type_embedding_out + statistics_inputs)
FC_relu = Dense(128, activation='relu', name='fc_relu_1')(concat)
FC_relu = Dense(128, activation='relu', name='fc_relu_2')(FC_relu)
max_pool = GlobalMaxPooling1D()(FC_relu)

model = concatenate(abilities_inputs + [max_pool])
model = Dense(64, activation='relu', name='fc_relu_3')(model)
model_out = Dense(1, activation='sigmoid', name='fc_sigmoid')(model)

model_in = abilities_inputs + modifier_type_embedding_in + statistics_inputs
model = Model(inputs=model_in, outputs=model_out)
model.compile(loss='binary_crossentropy', optimizer=Adam(lr=2e-05, decay=1e-3), metrics=['accuracy'])
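The key is `GlobalMaxPooling1D`: it takes the maximum over the variable-length modifier axis, so the pooled output has a fixed shape `(batch, 128)` no matter how many modifiers a batch contains. A NumPy sketch of that pooling step (feature values made up for illustration):

```python
import numpy as np

def global_max_pool_1d(x):
    # x: (batch, timesteps, features) -> (batch, features),
    # i.e. the elementwise max over the timestep (modifier) axis
    return x.max(axis=1)

# Two batches with different numbers of modifiers N
three_mods = np.random.rand(2, 3, 128)   # N = 3
seven_mods = np.random.rand(2, 7, 128)   # N = 7

print(global_max_pool_1d(three_mods).shape)  # (2, 128)
print(global_max_pool_1d(seven_mods).shape)  # (2, 128)
```

Note that samples within a single batch must still share a common N, so ragged samples need to be padded to the longest sequence in the batch (e.g. with `keras.preprocessing.sequence.pad_sequences`) before being fed to the model.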

[image: model summary for the variable-length network]

...