I would like to visualize my data based on multiple tensor variables, that is, based on different embedding variables. In other words, what I need to do is the following:
I need to store the 100-dimensional vectors (image features/embeddings) in 5 different variables, and then visualize my data based on each of those 5 variables separately: that is, based on the first 20 features, then the second 20 features, and so on.
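The column-splitting step on its own can be sketched like this with NumPy (the array contents here are random placeholders for the real features):

```python
import numpy as np

# Placeholder feature matrix: 5329 samples, 100-dimensional embeddings.
feature_vectors = np.random.rand(5329, 100)

# Slice the 100 columns into 5 groups of 20 features each.
groups = [feature_vectors[:, 20 * i: 20 * (i + 1)] for i in range(5)]

print([g.shape for g in groups])  # each group has shape (5329, 20)
```

Stacking the groups back together with `np.hstack` recovers the original matrix, so no features are lost or duplicated by the slicing.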
While looking into the embedding visualization tutorial at https://www.tensorflow.org/get_started/embedding_viz, I saw that it is possible to add multiple embeddings, which is exactly what I am looking for.
How can I do this in TensorFlow?
Any help is much appreciated!!
The reason it didn't work at first is that I was trying to divide the 100-dimensional embedding into 100 different variables, and that failed. When I divided the embedding into 5 parts instead, i.e. 5 different variables of 20 features each, it worked. Below is my code:
    import os

    import numpy as np
    import tensorflow as tf
    from tensorflow.contrib.tensorboard.plugins import projector

    LOG_DIR = \
        'C:/Users/user/PycharmProjects/VariationalAutoEncoder/' \
        'Tensorflow-DeconvNet-Segmentation/Embeddings'

    feature_vectors = np.loadtxt('features.txt')
    feature_vectors = feature_vectors[:5329]
    print("feature_vectors_shape:", feature_vectors.shape)

    # Split the 100-dimensional embedding into 5 variables of 20 features each.
    sub_features = []
    for i in range(5):
        features = tf.Variable(feature_vectors[:, 20 * i: 20 * (i + 1)],
                               name='features' + str(i))
        sub_features.append(features)

    with tf.Session() as sess:
        saver = tf.train.Saver()
        sess.run(tf.global_variables_initializer())
        # Save the checkpoint inside the log directory so TensorBoard can find it.
        saver.save(sess, os.path.join(LOG_DIR, 'features_images.ckpt'))

        config = projector.ProjectorConfig()
        # One embedding entry per sub-variable.
        for i in range(5):
            embedding = config.embeddings.add()
            embedding.tensor_name = sub_features[i].name
            embedding.sprite.image_path = \
                'C:/Users/user/PycharmProjects/VariationalAutoEncoder/' \
                'Tensorflow-DeconvNet-Segmentation/master.jpg'
            embedding.sprite.single_image_dim.extend([112, 112])

        # Saves a config file that TensorBoard will read during startup.
        projector.visualize_embeddings(tf.summary.FileWriter(LOG_DIR), config)
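For reference, the projector_config.pbtxt file that visualize_embeddings writes into LOG_DIR should look roughly like the fragment below, with one embeddings block per variable (the tensor names shown are illustrative; the trailing ":0" is the output index TensorFlow appends to a variable's name):

    embeddings {
      tensor_name: "features0:0"
      sprite {
        image_path: "C:/Users/user/PycharmProjects/VariationalAutoEncoder/Tensorflow-DeconvNet-Segmentation/master.jpg"
        single_image_dim: 112
        single_image_dim: 112
      }
    }
    embeddings {
      tensor_name: "features1:0"
      ...
    }

After running the script, point TensorBoard at LOG_DIR and each of the 5 embeddings appears in the projector tab's drop-down menu.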
Answered By – I. A