Autodisable xformers if using Torch2 in readme example #35

Open · wants to merge 1 commit into develop

Conversation

kabachuha

It's really strange to ask users (who may not be well versed in coding) to perform a code-editing action when it can be accomplished with a simple check.
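Something along these lines would do (a minimal sketch; the diffusers-style pipeline and the placeholder model id are assumptions, not this repo's actual readme code):

import torch
from diffusers import DiffusionPipeline

# Placeholder model id; the readme's actual snippet would go here.
pipe = DiffusionPipeline.from_pretrained("some/model-id")

# The proposed check: only opt into xformers attention on PyTorch < 2.0,
# since PyTorch 2.0 already ships an efficient attention kernel natively.
if int(torch.__version__.split(".")[0]) < 2:
    pipe.enable_xformers_memory_efficient_attention()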
@patrickvonplaten
Collaborator

Hmm, I guess this becomes a bit philosophical, but we also want to explain and show users how things work and how they can best make use of existing tools, instead of doing everything auto-magically.

It's a bit related to the PyTorch philosophy of "Simple over Easy": https://pytorch.org/docs/stable/community/design.html#principle-2-simple-over-easy

To better understand why it's not needed for PT 2.0 here, one can have a look at:
https://pytorch.org/blog/accelerated-diffusers-pt-20/
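To make this concrete, here is a toy, self-contained example (shapes are arbitrary): on PT 2.0 the efficient attention path is just a built-in function call, no extra package needed.

import torch
import torch.nn.functional as F

# PyTorch 2.0 ships scaled_dot_product_attention natively and dispatches to
# a flash / memory-efficient kernel automatically where available, which is
# why the manual xformers opt-in is redundant on PT >= 2.0.
q = k = v = torch.randn(1, 8, 64, 32)  # (batch, heads, seq_len, head_dim)
out = F.scaled_dot_product_attention(q, k, v)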

@kabachuha
Author

I believe the advanced users will read the code comment anyway, but it creates an additional barrier for people who just want to run the testing snippet and be satisfied with it. Contrary to your comment, the readme gives no explanation of why it should not be used. This only further mystifies the AI tech, imho.

Imagine if this were the case for larger projects such as StableDiffusion or text2video models, where users would be required to modify each attention block for the thing to work on their PCs.

@kabachuha
Author

kabachuha commented Apr 28, 2023

Some users may not even know what their torch version is. To check it, they will have to either do some command-line kung-fu or make a new Python file where they will have to add

import torch
print(torch.__version__)

and then execute it from the command line as well. They will also need to look up the solutions on the internet. It's simply unneeded frustration for common folks, who will either do the things above or wait for a GUI tool. These things will spoil their initial exposure to the product.
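(For reference, the whole check also fits in one shell command, assuming python resolves to the interpreter that has torch installed:

python -c "import torch; print(torch.__version__)"

but that is exactly the kind of command-line kung-fu most people won't know offhand.)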

@hithereai

I'm with @kabachuha; it makes no sense to have the weird comment about the necessity to edit attention blocks.

@patrickvonplaten
Collaborator

Ok for me to change, up to the maintainers of this repo :-)
