
Conversation

@Neeshamraghav012
Contributor

I have added clear information about what the layer does. It was not specified earlier, and because of that, users could get confused about what the layer does and when to use it.

@keras.utils.register_keras_serializable(package="keras_nlp")
class TokenAndPositionEmbedding(keras.layers.Layer):
-    """A layer which sums a token and position embedding.
+    """Token and position embeddings are ways of representing words and their order in a sentence
Member


The first line of a docstring should always be a single sentence fragment <80 characters. Let's leave this unchanged.

We could add a description to the next paragraph though! Maybe....

Token and position embeddings give a dense representation of the
input tokens, and their position in the sequence respectively. This
layer create a `keras.layers.Embedding` token embedding and a
`keras_nlp.layers.PositionEmbedding` position embedding and sums
their output when called. This layer assumes that the last dimension in
the input corresponds to the sequence dimension.
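
For context, a minimal usage sketch of the layer being documented (the sizes here are invented for illustration; the constructor arguments follow keras_nlp's documented TokenAndPositionEmbedding signature):

import numpy as np
import keras_nlp

# Hypothetical vocabulary/sequence/embedding sizes, chosen only for illustration.
embedding = keras_nlp.layers.TokenAndPositionEmbedding(
    vocabulary_size=10_000,
    sequence_length=128,
    embedding_dim=64,
)
token_ids = np.random.randint(0, 10_000, size=(2, 128))
outputs = embedding(token_ids)  # (2, 128, 64): token + position embeddings, summed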

Updated the docstring as suggested by the reviewer.
class TokenAndPositionEmbedding(keras.layers.Layer):
    """A layer which sums a token and position embedding.
    Token and position embeddings are ways of representing words and their order in a sentence
Member


  • Add a newline between the docstring summary (the first line) and this one.
  • Let's delete "so that a machine learning model like a neural network can understand them"; that feels a little under-technical.
  • Please format this so line lengths are <80 characters.
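
Applying those three points, the paragraph would read roughly as follows (a sketch of the suggested formatting, not the exact merged text):

class TokenAndPositionEmbedding(keras.layers.Layer):
    """A layer which sums a token and position embedding.

    Token and position embeddings are ways of representing words and their
    order in a sentence. This layer creates a `keras.layers.Embedding` token
    embedding and a `keras_nlp.layers.PositionEmbedding` position embedding
    and sums their output when called. This layer assumes that the last
    dimension in the input corresponds to the sequence dimension.
    """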

    Token and position embeddings are ways of representing words and their order in a sentence
    so that a machine learning model like a neural network can understand them. This
    layer create a `keras.layers.Embedding` token embedding and a
Member


create -> creates
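
To make the summing behavior concrete, here is a rough functional-API equivalent of what the docstring describes (illustrative only, not the library's actual implementation; all sizes are invented):

import keras
import keras_nlp

inputs = keras.Input(shape=(128,), dtype="int32")
# Dense representation of each token id.
tokens = keras.layers.Embedding(input_dim=10_000, output_dim=64)(inputs)
# Dense representation of each position, broadcast across the batch.
positions = keras_nlp.layers.PositionEmbedding(sequence_length=128)(tokens)
# The layer under discussion returns the elementwise sum of the two.
outputs = tokens + positions
model = keras.Model(inputs, outputs)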

@mattdangerw
Member

Thanks!

@mattdangerw merged commit ee99548 into keras-team:master Feb 16, 2023
mattdangerw added a commit to mattdangerw/keras-hub that referenced this pull request Feb 16, 2023
@mattdangerw mentioned this pull request Feb 16, 2023
mattdangerw added a commit that referenced this pull request Feb 16, 2023
