4 changes: 3 additions & 1 deletion keras_nlp/layers/token_and_position_embedding.py
@@ -22,7 +22,9 @@

@keras.utils.register_keras_serializable(package="keras_nlp")
class TokenAndPositionEmbedding(keras.layers.Layer):
"""A layer which sums a token and position embedding.
"""Token and position embeddings are ways of representing words and their order in a sentence
Member

The first line of a docstring should always be a single sentence fragment <80 characters. Let's leave this unchanged

We could add a description to the next paragraph though! Maybe....

Token and position embeddings give a dense representation of the
input tokens, and their position in the sequence respectively. This
layer creates a `keras.layers.Embedding` token embedding and a
`keras_nlp.layers.PositionEmbedding` position embedding and sums
their output when called. This layer assumes that the last dimension in
the input corresponds to the sequence dimension.

so that a machine learning model like a neural network can understand them.
This layer sums a token and position embedding.
This layer assumes that the last dimension in the input corresponds
to the sequence dimension.
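To make the behavior described in the docstring concrete, here is a minimal NumPy sketch of what summing token and position embeddings does. This is illustrative only, not the actual `keras_nlp` implementation; the table names, shapes, and random initialization are assumptions.

```python
import numpy as np

# Illustrative sketch of token + position embedding summation.
# Names and shapes here are assumptions, not the keras_nlp internals.
vocab_size, seq_len, dim = 100, 8, 4
rng = np.random.default_rng(0)
token_table = rng.normal(size=(vocab_size, dim))  # token embedding table
pos_table = rng.normal(size=(seq_len, dim))       # position embedding table

token_ids = np.array([[5, 17, 17, 2, 0, 0, 0, 0]])  # (batch, seq_len)
token_emb = token_table[token_ids]                   # (batch, seq_len, dim)
pos_emb = pos_table[np.arange(seq_len)]              # (seq_len, dim), broadcast over batch

# The layer sums the two embeddings when called; the last input
# dimension is treated as the sequence dimension.
output = token_emb + pos_emb
print(output.shape)  # (1, 8, 4)
```

Note that the same token id (17 at positions 1 and 2) produces different output vectors, since each position contributes its own embedding.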