I have a pretrained model with the following architecture.
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) (None, 200) 0
_________________________________________________________________
embedding (Embedding) (None, 200, 300) 402877800
_________________________________________________________________
spatial_dropout1d (SpatialDr (None, 200, 300) 0
_________________________________________________________________
bidirectional (Bidirectional (None, 128) 186880
_________________________________________________________________
dropout (Dropout) (None, 128) 0
_________________________________________________________________
batch_normalization_v1 (Batc (None, 128) 512
_________________________________________________________________
dense (Dense) (None, 6) 774
_________________________________________________________________
reshape (Reshape) (None, 6) 0
=================================================================
Total params: 403,065,966
Trainable params: 187,910
Non-trainable params: 402,878,056
_________________________________________________________________
The Reshape layer ensures that the logits have the same shape as the labels.
My input data is a TensorFlow Tensor of shape (200,), and the label is also a TensorFlow Tensor, of shape (6,).
My goal is to evaluate a single sample using the evaluate method provided by tensorflow.keras. To make this easier, I convert the Tensor objects to numpy arrays. Since the model accepts inputs of shape [None, 200], I have to reshape the input data before feeding it to the model. The model then produces logits of shape (1, 6), which the Reshape layer reshapes to (6,).
# clone_model is the Keras model
# sample_data.x is the input
# sample_data.y is the label
clone_model.evaluate([sample_data.x.numpy().reshape(1,200)], [sample_data.y.numpy()])
But in the end, I get the following error:
InvalidArgumentError: logits and labels must have the same first dimension, got logits shape [1,6] and labels shape [6]
[[{{node loss_4/dense_loss/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits}}]] [Op:StatefulPartitionedCall]
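The node name in the traceback suggests the model was compiled with a sparse cross-entropy loss. If so, that loss pairs logits of shape [batch, num_classes] with integer labels of shape [batch], not one-hot labels of shape [num_classes]. Here is a minimal numpy sketch of that shape contract (the values are illustrative, not from the actual model):

```python
import numpy as np

# Sparse softmax cross-entropy pairs logits of shape [batch, num_classes]
# with integer class-index labels of shape [batch] -- one index per sample.
logits = np.array([[2.0, 0.5, 0.1, 0.1, 0.1, 0.1]])  # shape (1, 6)
labels = np.array([0])                                 # shape (1,), NOT (6,)

# Manual computation: log-softmax each row, then pick the true-class entry.
shifted = logits - logits.max(axis=1, keepdims=True)
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
loss = -log_probs[np.arange(len(labels)), labels]
print(loss.shape)  # one loss value per sample: (1,)
```

Under this contract, logits of shape [1, 6] would need labels of shape [1] (a single integer), which is why labels of shape [6] trigger the "same first dimension" complaint.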
To get around this, I removed the Reshape layer and reshaped the labels to (1, 6) (sample_data.y.numpy().reshape(1,6)). But that didn't help; I ended up with the same error.
I'm wondering if anyone can point out what I'm missing here. Thanks in advance.