
I am testing tf.keras with tf.data so that I can do mini-batch optimization. I am using the MNIST dataset, and I am running the code in Google Colab. However, whenever I try to train the network, I always get this error: ValueError: Error when checking input: expected dense_18_input to have shape (784,) but got array with shape (1,). Here is my code:

import tensorflow as tf
from tensorflow.keras import layers
import numpy as np
import pandas as pd

!git clone https://github.com/DanorRon/my_repo
%cd my_repo
!ls

batch_size = 100
epochs = 10
alpha = 0.01
lambda_ = 0.01
h1 = 50

train = pd.read_csv('/content/sample_data/my_repo/mnist_train.csv.zip')
test = pd.read_csv('/content/sample_data/my_repo/mnist_test.csv.zip')

x_train = train.loc[:, '1x1':'28x28']
y_train = train.loc[:, 'label']

x_test = test.loc[:, '1x1':'28x28']
y_test = test.loc[:, 'label']

Train = tf.data.Dataset.from_tensor_slices((x_train, y_train))
Train.batch(batch_size).repeat(10).shuffle(1000)

model = tf.keras.Sequential()
model.add(layers.Dense(784, input_shape=(784,)))
model.add(layers.Dense(h1, activation='relu', kernel_regularizer=tf.keras.regularizers.l2(0.01)))
model.add(layers.Dense(10, activation='softmax', kernel_regularizer=tf.keras.regularizers.l2(0.01)))

model.compile(optimizer=tf.train.AdamOptimizer(alpha),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

model.fit(Train, epochs=epochs, steps_per_epoch=600)

I don't know what the problem is. I think my dimensions are correct, and I can't see any other issue. How can I fix this?

Edit: I have looked into and tested more things to find an answer, but I couldn't find anything that works. I have no idea what the problem could be.


1 Answer


I think maybe you should split x_train and y_train out of Train, and rewrite model.fit as model.fit(x_train, y_train, epochs=epochs, steps_per_epoch=600).
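
A minimal sketch of that suggestion, under some assumptions not stated in the question: random arrays stand in for the CSV data, and sparse_categorical_crossentropy is used because the labels in mnist_train.csv are integer class ids rather than one-hot vectors (with integer labels, 'categorical_crossentropy' would also fail):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical stand-ins for the x_train / y_train loaded from the CSVs:
# 1000 flattened 28x28 images and integer labels in [0, 10).
x_train = np.random.rand(1000, 784).astype('float32')
y_train = np.random.randint(0, 10, size=(1000,))

model = tf.keras.Sequential([
    layers.Dense(50, activation='relu', input_shape=(784,)),
    layers.Dense(10, activation='softmax'),
])

# tf.keras.optimizers.Adam instead of tf.train.AdamOptimizer, and a
# sparse loss because the labels are integers, not one-hot vectors.
model.compile(optimizer=tf.keras.optimizers.Adam(0.01),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Pass the arrays directly; Keras does the batching itself.
model.fit(x_train, y_train, batch_size=100, epochs=1, verbose=0)
```

Note also that if you do want to keep the tf.data pipeline, the transformations on line `Train.batch(batch_size).repeat(10).shuffle(1000)` return new datasets rather than modifying Train in place, so their result has to be assigned back (e.g. `Train = Train.shuffle(1000).batch(batch_size).repeat()`) or model.fit still sees the unbatched dataset.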

answered 2019-03-22T02:22:14.893