Adding RTDETRv2 #34773
Conversation
@jadechoghari Hi, thanks for working on this. The original PR for RT-DETRv2 can be found here. Perhaps you can fill in the blanks based on it.
Hi @SangbumChoi, yes - this is a new PR since we're using a new concept called modular.
Great. FYI, I've also confirmed that there is a slight tolerance issue when converting the v2 weights, since it differs in MSDA.
Yes, no worries, I checked your PR!
Hey! Here is a first round of reviews! 🤗 I hope it clears up some confusion about modular!
Also, let's be consistent with the library: let's rename the files and folders to `rt_detr_v2` instead of `rtdetrv2`, and use CamelCase for class names, i.e. `RtDetrV2Class` instead of `RTDetrv2Class`.
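The rename above can be mechanized. As a purely illustrative helper (not part of the repo), the snake_case file/folder name can be derived from a CamelCase class prefix like so:

```python
import re

def to_snake(name: str) -> str:
    """Derive the snake_case module name from a CamelCase class prefix,
    e.g. "RtDetrV2" -> "rt_detr_v2". Illustrative helper, not repo code."""
    # Insert "_" between a lowercase letter or digit and the next uppercase letter.
    return re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name).lower()

print(to_snake("RtDetrV2"))  # rt_detr_v2
```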
great! - i'll update accordingly - thx!
Hey @Cyrilvallez, lmk if anything else needs to be updated for Modular - i've incorporated all your suggestions and fixes!
Ok! As discussed in Slack and this review, let's first take care of all the naming conventions in the folder/file/class names 🤗
Also, add/modify all the needed remaining files!
Then, please run

```shell
make fixup
python modular_model_converter.py --files_to_parse src/transformers/models/rt_detr_v2/modular_rt_detr_v2.py
```

from the root of the `transformers` repo to take care of the coding style and convert your modular!
Once this is done, we'll discuss potential improvements for the modular itself!
Changes updated @Cyrilvallez! However, running `utils/check_modular_conversion.py` fails with:
```
Traceback (most recent call last):
  File "/home/user/app/transformers/utils/check_modular_conversion.py", line 75, in <module>
    non_matching_files += compare_files(modular_file_path, args.fix_and_overwrite)
  File "/home/user/app/transformers/utils/check_modular_conversion.py", line 55, in compare_files
    generated_modeling_content = convert_modular_file(modular_file_path)
  File "/home/user/app/transformers/utils/modular_model_converter.py", line 1447, in convert_modular_file
    for file, module in create_modules(cst_transformers).items():
  File "/home/user/app/transformers/utils/modular_model_converter.py", line 1387, in create_modules
    nodes_to_add, file_type, new_imports = get_class_node_and_dependencies(modular_mapper, class_name, node, files)
  File "/home/user/app/transformers/utils/modular_model_converter.py", line 1348, in get_class_node_and_dependencies
    relative_dependency_order = mapper.compute_relative_order(all_dependencies_to_add)
  File "/home/user/app/transformers/utils/modular_model_converter.py", line 739, in compute_relative_order
    remaining_dependencies.remove(class_name)
KeyError: 'RtDetrV2MultiheadAttention'
make: *** [Makefile:39: repo-consistency] Error 1
```
Hey! Indeed, because the order of classes in `modeling_rt_detr.py` is messy (some classes are used before being defined), the modular was failing! You can pull from #35056 for the fix.
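For intuition, the converter's relative-order computation behaves roughly like a topological sort over class dependencies. A minimal sketch (not the converter's actual code, which lives in `utils/modular_model_converter.py`) shows how a class that is referenced before it is defined surfaces as a `KeyError`:

```python
def relative_order(deps):
    """Order class names so each appears after its dependencies.
    `deps` maps a class name to the set of names it depends on.
    Minimal illustrative sketch of a topological sort, not the real converter logic."""
    remaining = set(deps)
    order = []
    while remaining:
        # A class is ready once all of its dependencies are already scheduled.
        ready = sorted(n for n in remaining if deps[n] <= set(order))
        if not ready:
            # Some dependency is referenced but never defined in `deps`,
            # so it can never be scheduled -- report it, as in the traceback above.
            missing = set().union(*deps.values()) - set(deps) - set(order)
            raise KeyError(sorted(missing)[0])
        order.append(ready[0])
        remaining.remove(ready[0])
    return order

print(relative_order({"Decoder": {"Attention"}, "Attention": set()}))
```

With a well-ordered file every class becomes "ready" eventually; a class used before being defined leaves a dependency that can never be scheduled.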
See comments, we're getting there! 🤗
EDIT: also, please run `make fixup` to clean up the files a bit 🙏
let's re-run the failed job; apparently it failed because of a network issue :)
run-slow: rt_detr_v2
This comment contains run-slow, running the specified jobs: ['models/rt_detr_v2'] ...
Thanks for working on the model! A few comments on my side:
feel free to merge when you are happy @qubvel
Hey @jadechoghari, I see the logits in the tests changed significantly after removing the custom kernel attention; please ensure they match the original implementation.
@qubvel of course, i've added a notebook at the top of the PR that replicates the original author's logits, which match the ones used in testing :)!
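The kind of tolerance check such a notebook performs can be sketched in pure Python (the logit values below are made up for illustration; real tests typically compare full logit tensors with `torch.allclose`):

```python
def allclose(a, b, atol=1e-4):
    """Elementwise absolute-tolerance check, mirroring torch.allclose for plain lists."""
    return len(a) == len(b) and all(abs(x - y) <= atol for x, y in zip(a, b))

# Hypothetical logit values -- illustrative only, not from the actual model.
reference = [-4.64, -5.00, -4.97]       # "original repo" output
ported = [-4.6401, -4.9999, -4.9702]    # "HF port" output

# Bitwise equality is not expected (MSDA implementations differ slightly),
# so the comparison uses an absolute tolerance instead.
assert allclose(ported, reference, atol=1e-3)
print("logits match within tolerance")
```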
run-slow: rt_detr, rt_detr_v2
This comment contains run-slow, running the specified jobs: models: ['models/rt_detr', 'models/rt_detr_v2']
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
@jadechoghari congratulations on the second model merged 🤗 great work 🚀
* cookiecutter add rtdetrv2
* make modular working
* working modelgit add .
* working modelgit add .
* finalize moduar inheritence
* finalize moduar inheritence
* Update src/transformers/models/rtdetrv2/modular_rtdetrv2.py (Co-authored-by: Cyril Vallez <[email protected]>)
* update modular and add rename
* remove output ckpt
* define loss_kwargs
* fix CamelCase naming
* fix naming + files
* fix modular and convert file
* additional changes
* fix modular
* fix import error (switch to lazy)
* fix autobackbone
* make style
* add
* update testing
* fix loss
* remove old folder
* fix testing for v2
* update docstring
* fix docstring
* add resnetv2 (with modular bug to fix)
* remove resnetv2 backbone
* fix changes
* small fixes
* remove rtdetrv2resnetconfig
* add rtdetrv2 name to convert
* make style
* Update docs/source/en/model_doc/rt_detr_v2.md (Co-authored-by: Steven Liu <[email protected]>)
* Update src/transformers/models/rt_detr_v2/modular_rt_detr_v2.py (Co-authored-by: Steven Liu <[email protected]>)
* Update src/transformers/models/rt_detr_v2/modular_rt_detr_v2.py (Co-authored-by: Steven Liu <[email protected]>)
* fix modular typo after review
* add reviewed changes
* add final review changes
* Update docs/source/en/model_doc/rt_detr_v2.md (Co-authored-by: Cyril Vallez <[email protected]>)
* Update src/transformers/models/rt_detr_v2/__init__.py (Co-authored-by: Cyril Vallez <[email protected]>)
* Update src/transformers/models/rt_detr_v2/convert_rt_detr_v2_weights_to_hf.py (Co-authored-by: Cyril Vallez <[email protected]>)
* add review changes
* remove rtdetrv2 resnet
* removing this weird project change
* change ckpt name from jadechoghari to author
* implement review and update testing
* update naming and remove wrong ckpt
* name
* make fix-copies
* Fix RT-DETR loss
* Add resources, fix name
* Fix repo in docs
* Fix table name

---------

Co-authored-by: jadechoghari <[email protected]>
Co-authored-by: Cyril Vallez <[email protected]>
Co-authored-by: Steven Liu <[email protected]>
Co-authored-by: qubvel <[email protected]>
What does this PR do?
This PR adds RTDETRv2 to the Transformers library. There is a new mechanism in transformers called modular, which adds new models by creating a `modeling_modelname.py` file. Since RTDETRv2 only updates the decoder part while keeping the rest of the model unchanged, it is an ideal use case for this modular approach.

What's left:
- `scratch` folder (auto-generated by the `add-model` cookiecutter)

Colab to replicate the original author's logits: https://colab.research.google.com/drive/1Vql-9JuFKz7N7l83NmHPP2E1ZyGZnpzX?usp=sharing