
Building a network in Keras by passing layers as parameters does not work

  •  0
  • Whoami  · Tech Community  · 6 years ago

    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential()
    model.add(Dense(64, activation='relu', input_shape=(784,)))
    model.add(Dense(128, activation='relu'))
    model.add(Dense(784, activation='relu'))
    model.compile(optimizer='adam', loss='mean_squared_error')
    

    This works without any problem. But if I build the model by passing the previous layer as a parameter to the next layer, I get an error.

    layer1 = Dense(64, activation='relu', input_shape=(784,) )
    layer2 = Dense(128, activation='relu') (layer1)
    layer3 = Dense(784, activation='relu') (layer2)
    model = Model(layer1, layer3)
    model.compile(optimizer='adam', loss='mean_squared_error')
    

    Here is the error:

    ValueError: Layer dense_2 was called with an input that isn't a symbolic tensor. Received type: <class 'keras.layers.core.Dense'>. Full input: [<keras.layers.core.Dense object at 0x7f1317396310>]. All inputs to the layer should be tensors.
    

    How can I fix this?

    1 Answer  |  6 years ago
        1
  •  1
  •   Kota Mori    6 years ago

    A Dense object is a layer, not a tensor, so it cannot be passed as the input of another layer. In the Keras functional API you start from a symbolic input tensor created with Input and call each layer on a tensor:

    from keras.layers import Input, Dense
    from keras.models import Model

    x = Input((784,))
    layer1 = Dense(64, activation='relu')(x)
    layer2 = Dense(128, activation='relu')(layer1)
    layer3 = Dense(784, activation='relu')(layer2)
    model = Model(inputs=x, outputs=layer3)
    model.compile(optimizer='adam', loss='mean_squared_error')
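
    If you prefer to keep the layer objects in separate variables, as the question attempts, the same fix applies: construct the layers first, then call each one on a tensor rather than on another layer object. A minimal sketch of that pattern (the variable names inp, dense1, dense2, dense3, and out are illustrative, not from the question):

    from keras.layers import Input, Dense
    from keras.models import Model

    # Layer objects can be created up front...
    inp = Input((784,))
    dense1 = Dense(64, activation='relu')
    dense2 = Dense(128, activation='relu')
    dense3 = Dense(784, activation='relu')

    # ...but each must be called on a tensor, not on another layer object.
    out = dense3(dense2(dense1(inp)))

    model = Model(inputs=inp, outputs=out)
    model.compile(optimizer='adam', loss='mean_squared_error')
    model.summary()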