
smZ CLIPTextEncode seems to error easily when the prompt is too long or contains unusual encoded characters or punctuation #45

Open
xueqing0622 opened this issue Jan 3, 2024 · 4 comments


@xueqing0622

smZ CLIPTextEncode seems to error easily when the prompt is too long or contains unusual encoded characters or punctuation.

Error occurred when executing smZ CLIPTextEncode:

tuple index out of range

File "F:\ComfyUI\ComfyUI\execution.py", line 154, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "F:\ComfyUI\ComfyUI\execution.py", line 84, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "F:\ComfyUI\ComfyUI\execution.py", line 77, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\nodes.py", line 87, in encode
result = run(**params)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 693, in run
cond, pooled = clip_clone.encode_from_tokens(tokens, True)
File "F:\ComfyUI\ComfyUI\comfy\sd.py", line 131, in encode_from_tokens
cond, pooled = self.cond_stage_model.encode_token_weights(tokens)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 416, in encode_token_weights
if multi: schedules = prompt_parser.get_multicond_learned_conditioning(model_hijack.cond_stage_model, texts, steps, None, opts.use_old_scheduling)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 270, in get_multicond_learned_conditioning
learned_conditioning = get_learned_conditioning(model, prompt_flat_list, steps, hires_steps, use_old_scheduling)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 198, in get_learned_conditioning
conds = model.forward(texts)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\sd_hijack_clip.py", line 207, in forward
z = self.process_tokens(tokens, multipliers)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\sd_hijack_clip.py", line 237, in process_tokens
z = self.encode_with_transformers(tokens)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 223, in encode_with_transformers
return self.encode_with_transformers_comfy_(tokens, return_pooled)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 124, in encode_with_transformers_comfy_
z, pooled = ClipTextEncoderCustom._forward(self.wrapped, tokens_orig)
File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 113, in _forward
z, pooled_output = self.forward(tokens)
File "F:\ComfyUI\ComfyUI\comfy\sd1_clip.py", line 160, in forward
tokens = self.set_up_textual_embeddings(tokens, backup_embeds)
File "F:\ComfyUI\ComfyUI\comfy\sd1_clip.py", line 131, in set_up_textual_embeddings
if y.shape[0] == current_embeds.weight.shape[1]:
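The crash can be read off the last frame: `comfy/sd1_clip.py` indexes `y.shape[0]`, and a `tuple index out of range` there means the embedding tensor `y` had an empty shape. A minimal pure-Python sketch of that failure mode (my reading of the trace, not the actual ComfyUI code — `passes_width_check` is a stand-in name):

```python
def passes_width_check(y_shape: tuple, embed_dim: int) -> bool:
    # Mirrors `if y.shape[0] == current_embeds.weight.shape[1]:` from
    # comfy/sd1_clip.py -- it assumes the embedding has at least one dimension.
    return y_shape[0] == embed_dim

# A single 768-wide embedding vector passes the check:
print(passes_width_check((768,), 768))  # True

# But a tensor with an empty shape (e.g. a prompt word that resolved to a
# malformed or mismatched embedding) has no index 0 at all:
try:
    passes_width_check((), 768)
except IndexError as err:
    print(err)  # tuple index out of range
```

So the node itself isn't choking on prompt length; something in the token stream is being treated as an embedding whose tensor doesn't have the expected shape.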

@xueqing0622
Author

xueqing0622 commented Jan 3, 2024

These are prompt samples that trigger the error (I'm using A1111 mode):

1:
1gir,black hair,Night,Standing in the meadow,Blue stars,looking at viewers,
(extremely detailed 8k illustration), (extremely detailed and beautiful background:1.2), ultra detailed painting, professional illustrasion, Ultra-precise depiction, Ultra-detailed depiction, (beautiful and aesthetic:1.2), vivid,intricate, nice hands, perfect hands,

2:
highly detailed black sumi-e painting of . in-depth study of perfection, created by a master. best quality, high resolution

@shiimizu
Owner

shiimizu commented Jan 3, 2024

It seems like an embedding was used? Do you have any embeddings that have the same name as one of the words you used?

@xueqing0622
Author

shiimizu, you are right! After I removed all the embeddings, it works.
How do I avoid this while still keeping my embeddings?

@shiimizu
Owner

shiimizu commented Jan 4, 2024

Make sure your embeddings are renamed to words you don't often use.
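One way to act on that advice is to check for collisions up front. The helper below is hypothetical (not part of smZNodes or ComfyUI); it assumes embeddings live in a folder where each file's stem is its trigger word, as in ComfyUI's `models/embeddings`:

```python
import re
from pathlib import Path

def find_embedding_collisions(prompt: str, embeddings_dir: str) -> set[str]:
    """Return prompt words that match an embedding filename stem.

    Hypothetical helper: rename (or move) any file it flags so the
    tokenizer can no longer confuse a plain prompt word with an embedding.
    """
    words = set(re.findall(r"[a-z0-9_-]+", prompt.lower()))
    stems = {p.stem.lower() for p in Path(embeddings_dir).iterdir() if p.is_file()}
    return words & stems
```

For the first prompt above, a file named e.g. `detailed.pt` in the embeddings folder would be flagged, since `detailed` also appears as an ordinary word.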
