Tensor is disposed

Issue

Most of the TensorFlow.js examples use a pre-trained model in the browser for prediction.

Here I am trying to create my own data in the browser and train the model with it.

Finally, I have reached a point where I think I am close to creating a dataset that can be fed to model.fit() to train the model.

I have two arrays, one containing images and the other containing labels.

let images = []
let labels = []

I am using a method to capture an image from the canvas and push it into the images array:

function getImage() {
        return tf.tidy(() => {
            const image = tf.browser.fromPixels($('#mycanvas')[0]);
            const batchedImage = image.expandDims(0);
            const norm = batchedImage.toFloat().div(tf.scalar(255)).sub(tf.scalar(1));
            return norm;
        });
    }

So I push an image and a label into the arrays whenever I press one of the arrow keys:

let collectData = (label) =>{
    tf.tidy(() => {
        const img = getImage();
        img.print() // check if it is a tensor
        //imges.concat(img)
        images.push(img)
        labels.push(label) // labels are 0,1,2
     })
 }

After creating the arrays with the training dataset, I pass them into the model.fit() method to start the training:

let fitModel = async () => { 
        let imageSet = tf.stack(images);
        let labelSet = tf.oneHot(tf.tensor1d(labels, 'int32'), 3);

        if (currentModel == null) {
            currentModel = createModel();
            currentModel.summary();
        }

        await currentModel.fit(imageSet, labelSet, {
            batchSize: 4,
            epochs: 20,
            shuffle: true,
            validationSplit: 0.1,
            callbacks: {
                onTrainBegin: () => console.log("Training Start"),
                onTrainEnd: () => console.log("Training End"),
                onBatchEnd: async (num, log) => {
                    await tf.nextFrame();
                    console.log(log)
                }
            }
        })
    }

Here tf.stack(images) throws an error stating "Tensor is disposed". I don't understand why that's happening. Technically this should work, as shown in the official documentation:

const a = tf.tensor1d([1, 2]);
const b = tf.tensor1d([3, 4]);
const c = tf.tensor1d([5, 6]);
let p = []
p.push(a)
p.push(b)
p.push(c)
console.log(p)
tf.stack(p).print();

So I tried another thing, tf.stack(tf.tensor(images)), and I get the same "Tensor is disposed" error.

Another thing I thought I would try was tf.concat(), which also didn't work. Does anyone have any idea how I can create a trainable dataset like this?

My first hidden layer is a conv2d with inputShape: [150, 300, 3], and the output layer has units: 3.

Any help would be really appreciated. Please explain your answers for better understanding.

Solution

The tensors are disposed because of the tf.tidy call wrapping them. tf.tidy disposes every tensor created inside its callback except the ones it returns, so the img pushed into images inside collectData is cleaned up as soon as the tidy callback ends, and tf.stack later finds only disposed tensors. To keep the tensor img while still cleaning up all intermediate tensors, it has to be explicitly returned from the callback of tf.tidy:

const img = tf.tidy(() => {
        const im = getImage();
        return im;
        // since im is returned, it will not be disposed;
        // but all unused intermediate tensors will be cleaned up
     })

images.push(img)
labels.push(label) // labels are 0,1,2

Answered By – edkeveked

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
