Hi,
I see there is ongoing work here. I tried your code with TensorFlow 1.12 and there are deprecation warnings recommending tf.data and flagging tf.nn.softmax_cross_entropy_with_logits. I ended up with the following change, which modifies the old code as little as possible:
At the beginning we change the way we load the data:
#mnist_data = mnist.input_data.read_data_sets(
# FLAGS.data_dir, one_hot=True, fake_data=FLAGS.fake_data)
train, test = tf.keras.datasets.mnist.load_data()
mnist_x, mnist_y = train
test_x, test_y = test
#create a tf.data.Dataset and do some transformations on it
mnist_x_ds = tf.data.Dataset.from_tensor_slices(mnist_x)
mnist_x_ds = mnist_x_ds.map(lambda x : tf.cast(x, tf.float32))
mnist_x_ds = mnist_x_ds.map(lambda x : tf.divide(x, 255.0))
mnist_x_ds = mnist_x_ds.map(lambda x : tf.reshape(x, [784]))
#repeat() means the dataset starts over from the beginning once the end is reached
#Then we call .batch(), which adds the extra batch dimension we need here
mnist_x_ds = mnist_x_ds.repeat()
xs1 = mnist_x_ds.batch(100)
xs1_it = xs1.make_one_shot_iterator()
xs1_n = xs1_it.get_next()
#creating a tf.data.Dataset. The same again with the labels.
#Normally a Dataset holds both images and labels together, e.g. so they can be shuffled easily;
#here we need them in separate Datasets
mnist_y_ds = tf.data.Dataset.from_tensor_slices(mnist_y)
mnist_y_ds = mnist_y_ds.map(lambda x : tf.one_hot(x, 10))
mnist_y_ds = mnist_y_ds.map(lambda x : tf.cast(x, tf.float64))
mnist_y_ds = mnist_y_ds.repeat()
ys1 = mnist_y_ds.batch(100)
ys1_it = ys1.make_one_shot_iterator()
ys1_n = ys1_it.get_next()
#Same here for the test-data-images
test_x_ds = tf.data.Dataset.from_tensor_slices(test_x)
test_x_ds = test_x_ds.map(lambda x : tf.cast(x, tf.float32))
test_x_ds = test_x_ds.map(lambda x : tf.divide(x, 255.0))
test_x_ds = test_x_ds.map(lambda x : tf.reshape(x, [784]))
#here you need to pass the size of the complete test set to .batch() to
#get a consistent accuracy value (a smaller value here is just for debugging)
t_xs1 = test_x_ds.batch(1000)
t_xs1 = t_xs1.repeat()
t_xs1_it = t_xs1.make_one_shot_iterator()
t_xs1_n = t_xs1_it.get_next()
#and the same for the test-data-labels
test_y_ds = tf.data.Dataset.from_tensor_slices(test_y)
test_y_ds = test_y_ds.map(lambda x : tf.one_hot(x, 10))
test_y_ds = test_y_ds.map(lambda x : tf.cast(x, tf.float64))
t_ys1 = test_y_ds.batch(1000)
t_ys1 = t_ys1.repeat()
t_ys1_it = t_ys1.make_one_shot_iterator()
t_ys1_n = t_ys1_it.get_next()
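A quick sanity check of the pipeline before wiring it into feed_dict; this assumes a default tf.Session() named sess as in the demo (the exact session setup may differ):
#Hypothetical shape check; each sess.run() pulls one batch from the iterator
#(this consumes batches, but repeat() makes the streams endless anyway)
print(sess.run(xs1_n).shape)    #expected: (100, 784)
print(sess.run(ys1_n).shape)    #expected: (100, 10)
print(sess.run(t_xs1_n).shape)  #expected: (1000, 784)
print(sess.run(t_ys1_n).shape)  #expected: (1000, 10)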
In def feed_dict() we get the values from the Datasets:
def feed_dict(is_train):
  if is_train or FLAGS.fake_data:
    #xs, ys = mnist_data.train.next_batch(100, fake_data=FLAGS.fake_data)
    #we call sess.run() to get a numpy array, which is what feed_dict needs as input
    #This example is not ideal for tf.data, I think, but perhaps there are
    #other ways
    xs = sess.run(xs1_n)
    ys = sess.run(ys1_n)
    k = FLAGS.dropout
    time.sleep(1)
  else:
    #xs, ys = mnist_data.test.images, mnist_data.test.labels
    xs = sess.run(t_xs1_n)
    ys = sess.run(t_ys1_n)
    k = 1.0
  return {x: xs, y_: ys, keep_prob: k}
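The other deprecation warning mentioned above comes from tf.nn.softmax_cross_entropy_with_logits. A minimal sketch of the usual TF 1.12 replacement, assuming the demo builds the loss from the one-hot labels y_ and a logits tensor y (the logits name is just illustrative and may differ in beholder_demo.py):
#Hypothetical loss snippet; softmax_cross_entropy_with_logits_v2 is the non-deprecated
#1.x variant, and tf.stop_gradient(y_) keeps gradients out of the labels, matching
#the old behaviour
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(
        labels=tf.stop_gradient(y_), logits=y))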
It works with the tensorboard tag 1.12.1.
It's not the perfect usage of the Datasets, because putting the images and labels into one Dataset would make it possible to e.g. shuffle easily, but it uses tf.data and makes the warnings go away :)
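For comparison, a minimal sketch of that combined variant (untested against the demo; the names train_ds, xs_n, ys_n are just illustrative), which puts images and labels into one Dataset so they can be shuffled together:
#Hypothetical combined pipeline: images and labels travel together, so shuffle()
#keeps them aligned and one iterator yields matching batches
train_ds = tf.data.Dataset.from_tensor_slices((mnist_x, mnist_y))
train_ds = train_ds.map(lambda img, lbl: (
    tf.reshape(tf.cast(img, tf.float32) / 255.0, [784]),
    tf.cast(tf.one_hot(lbl, 10), tf.float64)))
train_ds = train_ds.shuffle(10000).repeat().batch(100)
train_it = train_ds.make_one_shot_iterator()
xs_n, ys_n = train_it.get_next()
In feed_dict both values would then come from a single call, e.g. xs, ys = sess.run([xs_n, ys_n]), so the image and label batches always stay paired.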
flobotics changed the title from "code for 1.12.1 to use tf.data" to "code for 1.12.1 to use tf.data in beholder_demo.py" on Dec 22, 2018
Thank you for the report, and your work in discovering the root of these warnings! Unfortunately as of TensorBoard 2.4.0, the Beholder plugin code and its associated demo code have been removed. For more context on the removal, please see #3843.
Closing this issue since the relevant code is no longer in the repository.