I found a case that makes compel crash. Here is an MRE, using compel==2.0.3 (I did not attempt to find the shortest possible prompt that triggers the crash):

```python
from diffusers import StableDiffusionXLPipeline
from compel import Compel

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0"
)

prompts = [
    "\u20281. Style - (((Character entering the game))), - (((In-game, so a bit casual and simple style))), - Looking to the Right, - Full body must be visible, - (((3 heads tall))) \u20282. Concept \u2028- (((Hero in the zombie apocalypse)))\u2028 - The top is a white T-shirt \u2028- - (((He has a backpack on his back)))\u2028"
]

compel = Compel(
    tokenizer=[pipe.tokenizer, pipe.tokenizer_2],
    text_encoder=[pipe.text_encoder, pipe.text_encoder_2],
)
compel(prompts)
```

(The `\u2028` escapes are U+2028 LINE SEPARATOR characters embedded in the prompt.)
The reason seemingly comes from SDXL's tokenizer: look at `EmbeddingsProvider.get_token_ids` and you'll see that here some of the per-fragment token id lists are empty (after trimming bos and eos); I did not investigate whether this is expected behavior from the tokenizer.

This in turn triggers the creation of the `None` in your code, specifically in `_get_token_ranges_for_fragments` here. Btw, the type annotation of `_get_token_ranges_for_fragments` is incorrect in those cases: the returned list can contain `None`, not only `int`s.
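To make the failure mode concrete without running the full pipeline, here is a minimal sketch. The `trim_special` helper is hypothetical (it only mimics the bos/eos trimming described above, not compel's actual implementation); the ids 49406/49407 are CLIP's real bos/eos token ids. A fragment that tokenizes to special tokens only, as a bare `\u2028` plausibly does, ends up as an empty list after trimming:

```python
BOS, EOS = 49406, 49407  # CLIP's bos/eos token ids

def trim_special(token_ids):
    # Hypothetical stand-in for the bos/eos trimming done per fragment
    # in EmbeddingsProvider.get_token_ids.
    return [t for t in token_ids if t not in (BOS, EOS)]

# A fragment like "\u2028" can tokenize to special tokens only:
print(trim_special([BOS, EOS]))       # empty list -> no token range for this fragment
print(trim_special([BOS, 320, EOS]))  # normal fragment keeps its tokens
```

Downstream code that assumes each fragment has at least one token then has no valid range to emit, which is presumably where the `None` in `_get_token_ranges_for_fragments` comes from.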
Let me know,