Issue with negative prompt when using non truncated long prompts #74

Open
o5faruk opened this issue Nov 28, 2023 · 2 comments
o5faruk commented Nov 28, 2023

self.txt2img_pipe.load_textual_inversion(
    EMBEDDING_PATHS, token=EMBEDDING_TOKENS, local_files_only=True
)

textual_inversion_manager = DiffusersTextualInversionManager(self.txt2img_pipe)


self.compel_proc = Compel(
    tokenizer=self.txt2img_pipe.tokenizer,
    text_encoder=self.txt2img_pipe.text_encoder,
    textual_inversion_manager=textual_inversion_manager,
    truncate_long_prompts=False,
)
if prompt:
    conditioning = self.compel_proc.build_conditioning_tensor(prompt)
    if not negative_prompt:
        negative_prompt = ""  # it's necessary to create an empty prompt - it can also be very long, if you want
    negative_conditioning = self.compel_proc.build_conditioning_tensor(
        negative_prompt
    )
    [
        prompt_embeds,
        negative_prompt_embeds,
    ] = self.compel_proc.pad_conditioning_tensors_to_same_length(
        [conditioning, negative_conditioning]
    )
    ...
    output = pipe(
        prompt_embeds=prompt_embeds,
        negative_prompt_embeds=negative_prompt_embeds,
        guidance_scale=guidance_scale,
        generator=generator,
        num_inference_steps=num_inference_steps,
        **extra_kwargs,
    )

I'm having weird issues; all the relevant code is shown above. The negative_prompt messes up my image results, almost as if negatives are getting mixed up with positives.
This happens only if the prompt and negative prompt lengths exceed 77 tokens.
extra_kwargs does not contain prompt or negative_prompt, so only the embeds are passed into the pipeline. The pipeline in this case is ControlNet text-to-image.

Is it possible that negatives get mixed into positives in the pad_conditioning_tensors_to_same_length function?
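For reference, padding conditioning tensors to the same length is conceptually just appending padding embeddings to the shorter tensor along the sequence axis, with nothing copied between the positive and negative tensors. The following is a minimal numpy sketch of that idea (an illustration under assumed behavior, not compel's actual implementation; `pad_to_same_length` and `pad_vector` are hypothetical names):

```python
import numpy as np

CHUNK = 77  # CLIP's context length; long prompts are built from multiples of this


def pad_to_same_length(cond, neg_cond, pad_vector):
    """Illustrative sketch (not compel's real code): pad the shorter of two
    conditioning tensors of shape (batch, seq, hidden) with copies of a
    padding embedding so both end up with the same sequence length.
    Each tensor keeps only its own contents."""
    target = max(cond.shape[1], neg_cond.shape[1])

    def pad(t):
        n = target - t.shape[1]
        if n == 0:
            return t
        # repeat the padding embedding n times along the sequence axis
        pad_block = np.broadcast_to(pad_vector, (t.shape[0], n, t.shape[2]))
        return np.concatenate([t, pad_block], axis=1)

    return pad(cond), pad(neg_cond)


# toy shapes: batch=1, hidden=8; positive prompt spans 2 chunks, negative 1 chunk
cond = np.ones((1, 2 * CHUNK, 8))
neg = np.zeros((1, CHUNK, 8))
pad_vec = np.full((1, 1, 8), 0.5)
p, n = pad_to_same_length(cond, neg, pad_vec)
```

If the symptom were caused by padding alone, you would expect only trailing padding embeddings in the shorter tensor, never positive content inside the negative tensor (or vice versa).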

This is my image with a long negative prompt:
[image]

And this is the same seed, same prompt, no negative prompt:
[image]

@Kamillaova

@o5faruk see this: #59

@damian0815
Owner

yes, it's likely caused by #59
