ValueError: logits and labels must have the same shape, but got shapes [2] and [2,1]


Please help me understand my mistake in this TensorFlow.js code. I'm trying to get binary classification working with fitDataset.

Simplified example https://jsfiddle.net/9w8hx21o/4/.

In the example, I have four observations, each of shape 4 by 7, and four labels. At the start of training I get the error "logits and labels must have the same shape, but got shapes [2] and [2,1]".

const xs = [
    [
        [1, 1, 1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1, 1, 1],
    ],
    [
        [2, 2, 2, 2, 2, 2, 2],
        [2, 2, 2, 2, 2, 2, 2],
        [2, 2, 2, 2, 2, 2, 2],
        [2, 2, 2, 2, 2, 2, 2],
    ],
    [
        [3, 3, 3, 3, 3, 3, 3],
        [3, 3, 3, 3, 3, 3, 3],
        [3, 3, 3, 3, 3, 3, 3],
        [3, 3, 3, 3, 3, 3, 3],
    ],
    [
        [4, 4, 4, 4, 4, 4, 4],
        [4, 4, 4, 4, 4, 4, 4],
        [4, 4, 4, 4, 4, 4, 4],
        [4, 4, 4, 4, 4, 4, 4],
    ]
]

const ys = [0, 1, 0, 1]

const model = tf.sequential()
model.add(tf.layers.inputLayer({
    inputShape: [4, 7]
}))
model.add(tf.layers.conv1d({
    filters: 16,
    kernelSize: 2,
    activation: 'relu',
}))
model.add(tf.layers.flatten())
model.add(tf.layers.dense({
    units: 1,
    activation: 'sigmoid'
}))

model.summary()

model.compile({
    optimizer: 'adam',
    loss: 'binaryCrossentropy',
    metrics: ['accuracy']
})

const xDataset = tf.data.array(xs);
const yDataset = tf.data.array(ys);

const xyDataset = tf.data.zip({xs: xDataset, ys: yDataset}).batch(2).shuffle(2)

const print_xyDataset = async () => {
    await xyDataset.forEachAsync(e => {
        console.log('\n');
        for (let key in e) {
            console.log(key + ':');
            console.log('Shape ' + e[key].shape)
            e[key].print();
        }
    })
}

print_xyDataset()

const train = async () => {
    await model.fitDataset(xyDataset, {
        epochs: 4,
        callbacks: {
            onEpochEnd: async (epoch, logs) => {
                console.log(`EPOCH (${epoch + 1}): Train Accuracy: ${(logs.acc * 100).toFixed(2)}\n`);
            },
        }
    })
}

train().catch(e => console.log(e))

CodePudding user response:

You are probably running a newer version of TF. Older versions would quietly accept a missing trailing dimension between the true labels and the predictions, producing mathematically equivalent but internally unexpected behavior; newer versions require the shapes to match exactly. Your dense layer outputs shape [2, 1] per batch, while your labels have shape [2]. Give each label its own dimension:

const ys = [[0], [1], [0], [1]]

and see if that fixes it.
