tfdsv4 fix carla configs #1788

Merged: 14 commits, Dec 23, 2022

Conversation

lcadalzo
Contributor

Note that the carla attacks don't currently work with the new tfdsv4 pipeline. This is because the label keys now have a batch dimension, e.g. the old format that works with the attacks is

(Pdb) y_target[0]['boxes'].shape
(10, 4)

and the new:

(Pdb) y_target[0]['boxes'].shape
(1, 10, 4)

I'm not intending to address that in this PR, so I tested with --skip-attack.
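The shape mismatch above can be illustrated with a small sketch. The `strip_batch_dim` helper below is hypothetical (not part of this PR or of armory): it just squeezes the leading batch dimension of 1 that the tfdsv4 pipeline adds, restoring the shape the attacks expect.

```python
import numpy as np

def strip_batch_dim(y_target):
    # Hypothetical helper (not in this PR): drop the leading batch
    # dimension of 1 that the tfdsv4 pipeline adds to each label array.
    return [
        {
            k: np.squeeze(v, axis=0)
            if getattr(v, "ndim", 0) >= 2 and v.shape[0] == 1
            else v
            for k, v in y.items()
        }
        for y in y_target
    ]

# New tfdsv4 format: boxes come out as (1, 10, 4)
y_target = [{"boxes": np.zeros((1, 10, 4))}]
print(strip_batch_dim(y_target)[0]["boxes"].shape)  # (10, 4)
```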

@lcadalzo lcadalzo requested a review from davidslater November 30, 2022 19:20
@lcadalzo lcadalzo changed the title Tfdsv4 fix carla configs WIP: Tfdsv4 fix carla configs Nov 30, 2022
@lcadalzo
Contributor Author

marking WIP since I still need to add the dataset's .py file

@lcadalzo lcadalzo changed the title WIP: Tfdsv4 fix carla configs tfdsv4 fix carla configs Nov 30, 2022
@lcadalzo
Contributor Author

The only change between the carla_obj_det_dev.py added here and the one on develop in armory.data is in _split_generators(), and it is functionally equivalent.

@lcadalzo
Contributor Author

lcadalzo commented Dec 2, 2022

Ready for re-review @davidslater. Some of the preprocessing code this PR touches is being modified in #1797, but I think we can merge this in and make sure to test carla and xview when reviewing the latter PR.

@davidslater
Contributor

@lcadalzo can you resolve the merge conflict?

@lcadalzo
Contributor Author

lcadalzo commented Dec 6, 2022

@davidslater ready to go. The commits from today just rename a function and remove code that got duplicated while handling the merge conflict. I've tested both carla_obj_det_dev and carla_over_obj_det_dev from this branch. There's an error in the tide calculation, but the other metrics compute fine, so I think that's a separate issue.

edit: the tide error is occurring on this branch but not develop, so I'm investigating

@davidslater
Contributor

The error may be due to changes in develop that happened after we branched to tfdsv4.

@lcadalzo
Contributor Author

lcadalzo commented Dec 6, 2022

Meaning there have been fixes in develop that haven't made their way into the tfdsv4 branch yet?

@davidslater
Contributor

No, on second look, those occurred before develop was merged into tfdsv4, so I don't know.

@lcadalzo
Contributor Author

lcadalzo commented Dec 6, 2022

ah ok, the issue is occurring because, similar to this issue, the tfdsv4 pipeline outputs labels with a batch dimension, e.g.

(Pdb) y['boxes'].shape
(1, 15, 4)

whereas the code is expecting (15, 4).

@lcadalzo
Contributor Author

lcadalzo commented Dec 6, 2022

The reason mAP doesn't fail with a similar error is these lines. I think this PR should be fine to merge; we'll need to open a separate issue to tweak the tide calculation to be compatible with this format, since the tf preprocessing will always output a batch dimension.
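One way the tide conversion could be made tolerant of both label formats is sketched below. `unpack_boxes` is a hypothetical name, not the actual armory_to_tide signature; it accepts boxes shaped either (N, 4) or (1, N, 4) and yields per-box corners either way.

```python
import numpy as np

def unpack_boxes(y):
    # Sketch (hypothetical helper, not armory code): accept boxes as
    # (N, 4) or (1, N, 4) and yield per-box corner coordinates.
    boxes = np.asarray(y["boxes"])
    if boxes.ndim == 3 and boxes.shape[0] == 1:
        boxes = boxes[0]  # drop the tf batch dimension
    for box in boxes:
        x1, y1, x2, y2 = box  # each box is now shape (4,)
        yield x1, y1, x2, y2

# Works with both label formats
batched = {"boxes": np.arange(8).reshape(1, 2, 4)}
flat = {"boxes": np.arange(8).reshape(2, 4)}
print(len(list(unpack_boxes(batched))), len(list(unpack_boxes(flat))))
```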

@jprokos26
Contributor

Throwing an error when running carla_obj_det_adversarialpatch_undefended.json, but the URL provided is valid; was the cached dataset uploaded as well?

  File "/workspace/armory/datasets/download.py", line 70, in download
    download_file_from_s3(str(key), str(filepath), public=public)
    │                         │         │                 └ True
    │                         │         └ PosixPath('/armory/datasets/new_builds/cache/carla_obj_det_dev__2.0.0__cache.tar.gz')
    │                         └ 'datasets/cache/carla_obj_det_dev__2.0.0__cache.tar.gz'
    └ <function download_file_from_s3 at 0x7f8c2f2b2c10>

  File "/workspace/armory/datasets/download.py", line 38, in download_file_from_s3
    raise KeyError(f"File {key} not available in {bucket} bucket.")

KeyError: 'File datasets/cache/carla_obj_det_dev__2.0.0__cache.tar.gz not available in armory-public-data bucket.'

@jprokos26
Contributor

jprokos26 left a comment


Convention update.

@lcadalzo
Contributor Author

lcadalzo commented Dec 23, 2022

Throwing error when running carla_obj_det_adversarialpatch_undefended.json but url provided is valid, was the cached dataset uploaded as well?

@jprokos26 try again; it looks like the upload to armory-public-data/datasets/cache hadn't occurred, but I've just uploaded it.

@jprokos26
Contributor

Now throwing an error at end of run using --skip-attack:

File "/workspace/armory/metrics/task.py", line 940, in object_detection_mAP_tide
tidecv_ground_truth_list = armory_to_tide(y, image_id, is_detection=False)
                           │              │  └ array([69734696, 69734696, 69734696, 69734696, 69734696, 69734696,
                           │              │           69734696, 69734696, 69734696, 69734696, 69734696, 6...
                           │              └ {'area': array([[1869, 3543,  120,  107,  145,  102,  116,  130,  139,  149,  122,
                           │                         154, 3893, 7951,  172]]), 'boxes'...
                           └ <function armory_to_tide at 0x7fb2c23231f0>
File "/workspace/armory/metrics/task.py", line 882, in armory_to_tide
x1, y1, x2, y2 = y["boxes"]
                 └ {'area': array([1869, 3543,  120,  107,  145,  102,  116,  130,  139,  149,  122,
                           154, 3893, 7951,  172]), 'boxes': a...
ValueError: too many values to unpack (expected 4)
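The unpack failure in the trace above can be reproduced in isolation (shapes taken from the error; this is an illustration, not code from the PR). With the batch dimension present, the loop hands the metric a (15, 4) array instead of a single (4,) box, so the 4-way unpack sees 15 values.

```python
import numpy as np

batched = np.zeros((1, 15, 4))  # tfdsv4 labels keep a batch dimension

# Iterating over the batched array yields a (15, 4) array, so a
# 4-way unpack like the one in armory_to_tide fails:
try:
    for boxes in batched:
        x1, y1, x2, y2 = boxes
except ValueError as err:
    print(err)  # too many values to unpack (expected 4)

# Indexing away the batch dimension restores per-box unpacking:
for box in batched[0]:
    x1, y1, x2, y2 = box  # each box is shape (4,)
```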

@lcadalzo
Contributor Author

Now throwing an error at end of run using --skip-attack:

@jprokos26 see this comment and the ones above it. I still think there's a decision to be made here that applies to various datasets: do we modify the scenario code to remove the batch dimension, or do we update the metrics to expect it? I'm deferring that decision for now and not including a fix in this PR (we're tracking it via #1814).

@jprokos26
Copy link
Contributor

Now throwing an error at end of run using --skip-attack:

@jprokos26 see this comment and the ones above it. I still think there's a decision to be made here that applies to various datasets: do we modify the scenario code to remove the batch dimension, or do we update the metrics to expect it? I'm deferring that decision for now and not including a fix in this PR (we're tracking it via #1814).

Gotcha; then I agree this PR is ready to be merged given that issue is being tracked, since the dataset is now loading as expected.

@jprokos26 jprokos26 merged commit 1bcd0ee into twosixlabs:tfdsv4 Dec 23, 2022
@lcadalzo lcadalzo deleted the tfdsv4-fix-carla-configs branch December 23, 2022 20:53