3 changes: 2 additions & 1 deletion keras_hub/src/models/backbone.py
@@ -130,7 +130,8 @@ def from_preset(
1. a built-in preset identifier like `'bert_base_en'`
2. a Kaggle Models handle like `'kaggle://user/bert/keras/bert_base_en'`
3. a Hugging Face handle like `'hf://user/bert_base_en'`
4. a path to a local preset directory like `'./bert_base_en'`
4. a ModelScope Face handle like `'modelscope://user/bert_base_en'`
5. a path to a local preset directory like `'./bert_base_en'`
Contributor comment (severity: medium):
There's a small typo in the description of the ModelScope handle: it should read `ModelScope handle` rather than `ModelScope Face handle`, both for consistency with other parts of the codebase (e.g., `keras_hub/src/utils/preset_utils.py`) and to avoid confusion with Hugging Face.

Suggested change
4. a ModelScope Face handle like `'modelscope://user/bert_base_en'`
5. a path to a local preset directory like `'./bert_base_en'`
4. a ModelScope handle like `'modelscope://user/bert_base_en'`
5. a path to a local preset directory like `'./bert_base_en'`


This constructor can be called in one of two ways. Either from the base
class like `keras_hub.models.Backbone.from_preset()`, or from
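The handle formats listed in the docstring differ only in their URI scheme, so a preset string can be classified by prefix. The sketch below is purely illustrative (`parse_preset_handle` is a hypothetical helper, not the actual dispatch logic in `keras_hub/src/utils/preset_utils.py`):

```python
import os


def parse_preset_handle(preset: str) -> tuple[str, str]:
    """Classify a preset handle by its URI scheme.

    Returns a (source, path) pair. Illustrative sketch only; the real
    keras_hub implementation lives in preset_utils and differs.
    """
    for scheme in ("kaggle", "hf", "modelscope"):
        prefix = scheme + "://"
        if preset.startswith(prefix):
            return scheme, preset[len(prefix):]
    # Anything containing a path separator (or starting with ".") is
    # treated as a local preset directory; bare names are built-ins.
    if os.path.sep in preset or preset.startswith("."):
        return "local", preset
    return "builtin", preset
```

With this sketch, `"modelscope://user/bert_base_en"` routes to the ModelScope loader while `"./bert_base_en"` is treated as a local directory, matching the five cases the docstring enumerates.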
4 changes: 3 additions & 1 deletion keras_hub/src/models/sam/sam_prompt_encoder.py
@@ -67,7 +67,9 @@ def __init__(
self.activation = activation

self.positional_embedding_layer = RandomFrequencyPositionalEmbeddings(
num_positional_features=self.hidden_size // 2, scale=1
num_positional_features=self.hidden_size // 2,
scale=1,
dtype=self.dtype,
)
Comment on lines 69 to 73
Contributor comment (severity: high):

Passing dtype=self.dtype here is a good step towards supporting bfloat16 inference. However, to ensure full compatibility and prevent potential dtype mismatches, this change should be applied consistently to all other layers instantiated within the __init__ method.

For example, the keras.layers.Embedding layers and the layers inside mask_downscaler are also missing the dtype argument. Without this, the model might not function correctly when using bfloat16 precision.
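The reviewer's point — that the parent layer's dtype should be forwarded to every sub-layer constructed in `__init__`, or the omitted ones silently fall back to float32 — can be sketched framework-independently. The classes below are hypothetical stand-ins for Keras layers, not the actual API:

```python
class SubLayer:
    """Stand-in for a Keras sub-layer that defaults to float32."""

    def __init__(self, units: int, dtype: str = "float32"):
        self.units = units
        self.dtype = dtype


class ParentLayer:
    """Stand-in for SAMPromptEncoder: forwards its dtype everywhere.

    If even one sub-layer is built without dtype=self.dtype, it keeps
    the float32 default and a bfloat16 forward pass hits a mismatch.
    """

    def __init__(self, hidden_size: int, dtype: str = "float32"):
        self.dtype = dtype
        # Every sub-layer receives dtype=self.dtype, mirroring the fix
        # applied to RandomFrequencyPositionalEmbeddings in this diff.
        self.positional_embedding = SubLayer(hidden_size // 2, dtype=self.dtype)
        self.point_embed = SubLayer(hidden_size, dtype=self.dtype)
        self.mask_downscaler = SubLayer(hidden_size // 8, dtype=self.dtype)
```

Instantiating `ParentLayer(256, dtype="bfloat16")` leaves all three sub-layers in bfloat16, which is the consistency property the comment asks for.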

Contributor (author) reply:
I have tested it; modifying the dtype only here is sufficient.


self.foreground_point_embed = keras.layers.Embedding(