Add projector plugin colab #3423
Conversation
Adds a colab demo demonstrating how to use the projector plugin locally.
GalOshri commented on 2020-03-26T02:35:18Z: Should be 2020.
hfiller commented on 2020-03-31T21:19:13Z: Done.
GalOshri commented on 2020-03-26T02:35:19Z: The table here isn't rendering correctly.
GalOshri commented on 2020-03-26T02:35:20Z: I think "highly dimensional embeddings" -> "high-dimensional embeddings".
Also, "visuallizing" is a typo ("visualizing").
An image here might be nice :).
GalOshri commented on 2020-03-26T02:35:20Z: I don't think we need to say that we are using TF 2.
"tensorboard" -> "TensorBoard" (there is also an extra backtick).
It seems like we are not actually using the MNIST dataset below.
GalOshri commented on 2020-03-26T02:35:21Z: Maybe start with "We will use a dataset of ..." just to make it read a bit more smoothly.
GalOshri commented on 2020-03-26T02:35:22Z: Should we refer to it as just "embedding" or "embedding layer"? I think the layer is what provides the embeddings.
I think instead of "vector" we should explicitly say "embedding".
It might be worth linking to additional resources so people can learn about embeddings, e.g. https://www.tensorflow.org/tutorials/text/word_embeddings?hl=en
GalOshri commented on 2020-03-26T02:35:22Z: "The layer we are embedding" makes it sound like "embedding" is a verb we are applying to the layer. I think it should instead be "The embedding layer".
GalOshri commented on 2020-03-26T02:35:24Z: Can this be done with the Keras TensorBoard callback? Is this the only way of saving embeddings?
hfiller commented on 2020-03-30T08:07:13Z: Currently yes. I believe the Keras TensorBoard callback is broken; I'm currently working on fixing this.
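For context, the manual checkpoint-based approach under discussion can be sketched roughly as follows. This is a minimal sketch, not the notebook's exact code: the log directory, the 100x16 random matrix, and the `word_<i>` labels are stand-ins (in the tutorial, the matrix would come from the trained model's embedding layer, e.g. `model.layers[0].get_weights()[0]`).

```python
import os
import tensorflow as tf
from tensorboard.plugins import projector

# Hypothetical log directory and a stand-in embedding matrix.
log_dir = "/tmp/embedding_logs"
os.makedirs(log_dir, exist_ok=True)
weights = tf.Variable(tf.random.normal([100, 16]))

# Write a metadata file mapping each embedding row to a label.
with open(os.path.join(log_dir, "metadata.tsv"), "w") as f:
    for i in range(100):
        f.write(f"word_{i}\n")

# Save the embedding in a checkpoint the projector can load.
checkpoint = tf.train.Checkpoint(embedding=weights)
checkpoint.save(os.path.join(log_dir, "embedding.ckpt"))

# Tell the projector which saved tensor to visualize and where its labels are.
config = projector.ProjectorConfig()
embedding = config.embeddings.add()
# tf.train.Checkpoint stores the variable under this object path.
embedding.tensor_name = "embedding/.ATTRIBUTES/VARIABLE_VALUE"
embedding.metadata_path = "metadata.tsv"
projector.visualize_embeddings(log_dir, config)
```

Running `tensorboard --logdir /tmp/embedding_logs` and opening the Projector tab should then show the points, each labeled from `metadata.tsv`.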
GalOshri commented on 2020-03-26T02:35:24Z: We should add an image here, otherwise the tutorial page won't have it (see how it is done, for example, in https://github.com/tensorflow/tensorboard/blob/90227dcf7e7f900ba77b5b7ced1b14378a7ec98f/docs/hyperparameter_tuning_with_hparams.ipynb).
I think it would be useful to add some explanation of why this is interesting/valuable. What does the user get from doing this? Is there some insight?
hfiller commented on 2020-03-31T21:20:30Z: Added some analysis, including a comparison between Alfred Hitchcock and Wes Anderson. Let me know what you think.
Thank you for adding this! This tutorial will be super valuable for helping people learn about and get started with the projector plugin. Can you also add it to https://github.com/tensorflow/tensorboard/blob/master/docs/_book.yaml? That enables it to show up in the left pane on the docs website. Let's put it under the hyperparameter tuning tutorial.
@@ -0,0 +1,337 @@
{
"Tensorboard" -> "TensorBoard"
"from your tensorflow projects from the logs in the specified directory `log_dir`" is a bit confusing. Perhaps something like "from the logs in the specified `log_dir` directory"?
@@ -0,0 +1,337 @@
{
It might be worth including a sentence on how you identified the more closely associated words (search for a particular word and its neighbors are highlighted; these are the words closest to it in the embedding space).
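For readers wondering what that highlighting corresponds to, it is essentially a nearest-neighbor search in the embedding space. A minimal sketch under stated assumptions: the five-word vocabulary and the random matrix are made up for illustration (in the tutorial, the vectors would come from the trained embedding layer), and cosine similarity is used here, though the projector also supports Euclidean distance.

```python
import numpy as np

# Hypothetical vocabulary and a randomly generated embedding matrix.
rng = np.random.default_rng(0)
vocab = ["alfred", "hitchcock", "wes", "anderson", "movie"]
embeddings = rng.normal(size=(len(vocab), 8))

def nearest_neighbors(word, k=2):
    """Return the k vocabulary words closest to `word` by cosine similarity."""
    idx = vocab.index(word)
    # Normalize rows so the dot product equals cosine similarity.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = normed @ normed[idx]   # similarity of every word to `word`
    order = np.argsort(-sims)     # most similar first
    return [vocab[j] for j in order if j != idx][:k]

print(nearest_neighbors("movie"))
```

The words the projector highlights for a query are its top-k list under the chosen distance metric, computed exactly like this (just over a much larger vocabulary).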
I left a few other comments through ReviewNB, but this looks great once those are resolved!
Thank you for taking the time to do this and also for finding a great example.
docs/_book.yaml
Outdated
@@ -23,6 +23,8 @@ upper_tabs:
path: /tensorboard/graphs
- title: "Hyperparameter tuning"
path: /tensorboard/hyperparameter_tuning_with_hparams
- title: "Projector plugin" |
We want to avoid using "plugin" in user documentation. How about "Embedding projector" or "Projector dashboard"?
* Add projector plugin colab
* Adds a colab demo demonstrating how to use the projector plugin locally.
* Add embedding_projector image for colab
* Add colab to _book.yaml
* Add analysis to projector plugin
* Adding images for embedding projector colab
* Rename Projector plugin to Embedding projector