Problems getting the activation array of an intermediate layer from a pre-trained model using Keras and Theano?
1 vote
/ 18 October 2019

Can anyone help me solve the following problem?

I set up fine-tuning and froze all layers except layers[2], because I want to get the activation values from layers[2] only.

Environment: Keras == 1.1.0, Theano == 1.0.2, numpy == 1.15.1, scipy == 1.3.0

Network summary before freezing:

Layer (type)                     Output Shape          Param #     Connected to
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 512)           2097664     dense_input_1[0][0]
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 512)           0           dense_1[0][0]
                                                                    dense_1[0][0]
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 32)            16416       dropout_1[0][0]
                                                                    dropout_1[1][0]
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 32)            0           dense_2[0][0]
                                                                    dense_2[1][0]
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 1)              33         dropout_2[0][0]
                                                                    dropout_2[1][0]
____________________________________________________________________________________________________
Total params: 2114113

Freezing the layers:

for layer in model.layers:          # freeze every layer first
    layer.trainable = False
model.layers[2].trainable = True    # then unfreeze only layers[2]
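
As a side check (not in the original post), the trainable flags can be listed after re-compiling the model; in Keras a change to `trainable` generally only takes effect once the model is compiled again, and the optimizer/loss below are just placeholder assumptions:

model.compile(optimizer='rmsprop', loss='binary_crossentropy')  # assumed settings; re-compile so the freeze takes effect

# print each layer together with its trainable flag
for i, layer in enumerate(model.layers):
    print(i, layer.name, layer.trainable)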

Network summary after freezing:

Layer (type)                     Output Shape          Param #     Connected to
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 512)           0           dense_input_1[0][0]
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 512)           0           dense_1[0][0]
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 32)            16416       dropout_1[1][0]
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 32)            0           dense_2[1][0]
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 1)              0          dropout_2[1][0]
____________________________________________________________________________________________________
Total params: 16416

To print the output of layers[2]:

OutFunc = keras.backend.function([model2.input], [model2.layers[2].get_output_at(0)])
out_val = OutFunc([inputs])[0]
print(out_val)

This returns the following error:

> MissingInputError                         Traceback (most recent call last)
<ipython-input-203-23bb284b98f3> in <module>
      1 #OutFunc = keras.backend.function([model2.input], [model2.layers[0].output])
----> 2 OutFunc = keras.backend.function([model2.input], [model2.layers[2].get_output_at(0)])
      3 
      4 
      5 out_val = OutFunc([inputs])[0]

> ~/anaconda3/lib/python3.7/site-packages/keras/backend/theano_backend.py in function(inputs, outputs, updates, **kwargs)
    725     return T.clip(x, min_value, max_value)
    726 
--> 727 
    728 def equal(x, y):
    729     return T.eq(x, y)

> ~/anaconda3/lib/python3.7/site-packages/keras/backend/theano_backend.py in __init__(self, inputs, outputs, updates, **kwargs)
    711 
    712 def pow(x, a):
--> 713     return T.pow(x, a)
    714 
    715 

> ~/anaconda3/lib/python3.7/site-packages/theano/compile/function.py in function(inputs, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input)
    315                    on_unused_input=on_unused_input,
    316                    profile=profile,
--> 317                    output_keys=output_keys)
    318     return fn

> ~/anaconda3/lib/python3.7/site-packages/theano/compile/pfunc.py in pfunc(params, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input, output_keys)
    484                          accept_inplace=accept_inplace, name=name,
    485                          profile=profile, on_unused_input=on_unused_input,
--> 486                          output_keys=output_keys)
    487 
    488 

> ~/anaconda3/lib/python3.7/site-packages/theano/compile/function_module.py in orig_function(inputs, outputs, mode, accept_inplace, name, profile, on_unused_input, output_keys)
   1837                   on_unused_input=on_unused_input,
   1838                   output_keys=output_keys,
-> 1839                   name=name)
   1840         with theano.change_flags(compute_test_value="off"):
   1841             fn = m.create(defaults)

> ~/anaconda3/lib/python3.7/site-packages/theano/compile/function_module.py in __init__(self, inputs, outputs, mode, accept_inplace, function_builder, profile, on_unused_input, fgraph, output_keys, name)
   1485             # OUTPUT VARIABLES)
   1486             fgraph, additional_outputs = std_fgraph(inputs, outputs,
-> 1487                                                     accept_inplace)
   1488             fgraph.profile = profile
   1489         else:

> ~/anaconda3/lib/python3.7/site-packages/theano/compile/function_module.py in std_fgraph(input_specs, output_specs, accept_inplace)
    179 
    180     fgraph = gof.fg.FunctionGraph(orig_inputs, orig_outputs,
--> 181                                   update_mapping=update_mapping)
    182 
    183     for node in fgraph.apply_nodes:

> ~/anaconda3/lib/python3.7/site-packages/theano/gof/fg.py in __init__(self, inputs, outputs, features, clone, update_mapping)
    173 
    174         for output in outputs:
--> 175             self.__import_r__(output, reason="init")
    176         for i, output in enumerate(outputs):
    177             output.clients.append(('output', i))

> ~/anaconda3/lib/python3.7/site-packages/theano/gof/fg.py in __import_r__(self, variable, reason)
    344         # Imports the owners of the variables
    345         if variable.owner and variable.owner not in self.apply_nodes:
--> 346                 self.__import__(variable.owner, reason=reason)
    347         elif (variable.owner is None and
    348                 not isinstance(variable, graph.Constant) and

> ~/anaconda3/lib/python3.7/site-packages/theano/gof/fg.py in __import__(self, apply_node, check, reason)
    389                                      "for more information on this error."
    390                                      % (node.inputs.index(r), str(node)))
--> 391                         raise MissingInputError(error_msg, variable=r)
    392 
    393         for node in new_nodes:

> MissingInputError: Input 0 of the graph (indices start from 0), used to compute InplaceDimShuffle{x,x}(keras_learning_phase), was not provided and not given a value. Use the Theano flag exception_verbosity='high', for more information on this error.

> Backtrace when that variable is created:

>   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jayne/anaconda3/lib/python3.7/site-packages/keras/backend/__init__.py", line 61, in <module>
    from .theano_backend import *
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jayne/anaconda3/lib/python3.7/site-packages/keras/backend/theano_backend.py", line 23, in <module>
    _LEARNING_PHASE = T.scalar(dtype='uint8', name='keras_learning_phase')  # 0 = test, 1 = train

...
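
The traceback points at the `keras_learning_phase` placeholder: the dropout layers make the graph depend on the train/test flag, and the backend function was built without it. For reference, a minimal sketch of the usual workaround (not verified here, assuming the same `model2` and `inputs` as above) is to pass the learning phase as an extra input and feed 0 for test mode:

import keras.backend as K

# Sketch only: add the learning-phase flag as an extra input so Theano
# no longer reports keras_learning_phase as a missing input.
OutFunc = K.function([model2.input, K.learning_phase()],
                     [model2.layers[2].get_output_at(0)])

# 0 = test phase (dropout disabled), 1 = train phase
out_val = OutFunc([inputs, 0])[0]
print(out_val)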