
Update hipblaslt_supported function to reflect hipblaslt supported architectures #3544

Merged
1 commit merged into develop from saghir/add-arch-hipblaslt-supported on Oct 23, 2024

Conversation

@ahsan-ca ahsan-ca commented Oct 21, 2024

Update the list of architectures in hipblaslt_supported to accurately reflect the architectures that support hipblaslt.

@ahsan-ca ahsan-ca added the bugfix label (Fixes a bug found in the code.) on Oct 21, 2024
@ahsan-ca ahsan-ca requested review from pfultz2 and kahmed10 October 21, 2024 18:52
@ahsan-ca ahsan-ca self-assigned this Oct 21, 2024
@ahsan-ca ahsan-ca requested a review from causten as a code owner October 21, 2024 18:52
@ahsan-ca ahsan-ca force-pushed the saghir/add-arch-hipblaslt-supported branch from 380a583 to 95701b6 on October 21, 2024 at 18:57
Comment on lines 60 to 62
return ((starts_with(device_name, "gfx9") and device_name >= "gfx908" and
         not starts_with(device_name, "gfx10")) ||
        starts_with(device_name, "gfx110") ||
        starts_with(device_name, "gfx120"));
@ahsan-ca (Contributor, Author) commented:
From hipblaslt:
Required hardware:
gfx90a card
gfx94x card
gfx110x card
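A minimal, self-contained sketch of a prefix check restricted to exactly the three families listed above (gfx90a, gfx94x, gfx110x). This is for illustration only: it is not the code merged in this PR (which also admits other gfx9 targets at or above gfx908 as well as gfx120x), the function name hipblaslt_doc_supported is hypothetical, and the starts_with helper below is a local stand-in for the one used in MIGraphX.

#include <string>
#include <string_view>

// Local stand-in for the starts_with helper used in the snippet under review.
static bool starts_with(std::string_view name, std::string_view prefix)
{
    return name.substr(0, prefix.size()) == prefix;
}

// Sketch: true only for the families listed in the hipblaslt hardware requirements above.
static bool hipblaslt_doc_supported(const std::string& device_name)
{
    return starts_with(device_name, "gfx90a") or
           starts_with(device_name, "gfx94") or
           starts_with(device_name, "gfx110");
}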

@ahsan-ca (Contributor, Author) commented:
To do: confirm whether gfx90a is covered by device_name >= "gfx908". Remove the not starts_with(device_name, "gfx10") check, which is redundant because a name that starts with "gfx9" cannot also start with "gfx10".
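As a quick sanity check on the first question (a hedged illustration, not project code): std::string comparison is lexicographic, and 'a' sorts after '8' in ASCII, so "gfx90a" >= "gfx908" does hold.

#include <cassert>
#include <string>

int main()
{
    // The shared prefix "gfx90" matches; then 'a' (0x61) > '8' (0x38),
    // so gfx90a is admitted by the >= "gfx908" test in the snippet under review.
    assert(std::string("gfx90a") >= "gfx908");
    // gfx906, by contrast, sorts before gfx908 and would be rejected.
    assert(std::string("gfx906") < "gfx908");
    return 0;
}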

codecov bot commented Oct 21, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 92.16%. Comparing base (04ac9fc) to head (af78270).
Report is 139 commits behind head on develop.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #3544      +/-   ##
===========================================
- Coverage    92.17%   92.16%   -0.02%     
===========================================
  Files          512      512              
  Lines        21393    21401       +8     
===========================================
+ Hits         19720    19724       +4     
- Misses        1673     1677       +4     

☔ View full report in Codecov by Sentry.

@ahsan-ca ahsan-ca force-pushed the saghir/add-arch-hipblaslt-supported branch 2 times, most recently from 6b3c47c to 6fce840 on October 21, 2024 at 19:17
@causten causten requested a review from aarushjain29 October 21, 2024 21:14
@ahsan-ca ahsan-ca force-pushed the saghir/add-arch-hipblaslt-supported branch from 6fce840 to af78270 on October 21, 2024 at 21:40
@ahsan-ca ahsan-ca changed the title Add gfx110x and gfx120x cards to hipblaslt supported architectures Update hipblaslt_supported function to reflect hipblaslt supported architectures Oct 21, 2024
@migraphx-bot (Collaborator) commented:
Test  Batch  Rate new (af7827)  Rate old (b73def)  Diff
torchvision-resnet50 64 3,260.10 3,257.93 0.07%
torchvision-resnet50_fp16 64 6,988.32 6,992.99 -0.07%
torchvision-densenet121 32 2,436.67 2,432.26 0.18%
torchvision-densenet121_fp16 32 4,084.94 4,038.39 1.15%
torchvision-inceptionv3 32 1,639.24 1,638.89 0.02%
torchvision-inceptionv3_fp16 32 2,764.22 2,761.69 0.09%
cadene-inceptionv4 16 776.35 776.39 -0.01%
cadene-resnext64x4 16 811.69 811.37 0.04%
slim-mobilenet 64 7,536.37 7,532.73 0.05%
slim-nasnetalarge 64 211.51 211.42 0.04%
slim-resnet50v2 64 3,503.03 3,507.25 -0.12%
bert-mrpc-onnx 8 1,146.86 1,147.76 -0.08%
bert-mrpc-tf 1 469.43 469.91 -0.10%
pytorch-examples-wlang-gru 1 412.23 514.96 -19.95% 🔴
pytorch-examples-wlang-lstm 1 383.67 386.61 -0.76%
torchvision-resnet50_1 1 770.37 772.05 -0.22%
cadene-dpn92_1 1 408.52 398.73 2.45%
cadene-resnext101_1 1 384.27 383.67 0.16%
onnx-taau-downsample 1 342.76 342.33 0.13%
dlrm-criteoterabyte 1 33.33 33.33 0.00%
dlrm-criteoterabyte_fp16 1 52.73 52.70 0.04%
agentmodel 1 8,355.68 8,056.20 3.72% 🔆
unet_fp16 2 58.90 58.92 -0.04%
resnet50v1_fp16 1 951.23 950.32 0.10%
resnet50v1_int8 1 1,027.69 1,000.02 2.77%
bert_base_cased_fp16 64 1,170.28 1,169.24 0.09%
bert_large_uncased_fp16 32 363.50 363.69 -0.05%
bert_large_fp16 1 200.06 198.89 0.59%
distilgpt2_fp16 16 2,199.93 2,203.09 -0.14%
yolov5s 1 535.85 540.85 -0.93%
tinyllama 1 43.45 43.43 0.04%
vicuna-fastchat 1 174.55 170.64 2.29%
whisper-tiny-encoder 1 417.96 418.21 -0.06%
whisper-tiny-decoder 1 433.58 426.10 1.75%

This build is not recommended to merge 🔴

@migraphx-bot (Collaborator) commented:
✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance
✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance
✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance
✅ cadene-dpn92_1: PASSED: MIGraphX meets tolerance
✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance
✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance
✅ agentmodel: PASSED: MIGraphX meets tolerance
✅ unet: PASSED: MIGraphX meets tolerance
✅ resnet50v1: PASSED: MIGraphX meets tolerance
✅ bert_base_cased_fp16: PASSED: MIGraphX meets tolerance
🔴 bert_large_uncased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
✅ bert_large: PASSED: MIGraphX meets tolerance
✅ yolov5s: PASSED: MIGraphX meets tolerance
✅ tinyllama: PASSED: MIGraphX meets tolerance
✅ vicuna-fastchat: PASSED: MIGraphX meets tolerance
✅ whisper-tiny-encoder: PASSED: MIGraphX meets tolerance
✅ whisper-tiny-decoder: PASSED: MIGraphX meets tolerance
✅ distilgpt2_fp16: PASSED: MIGraphX meets tolerance

@causten causten merged commit ee7a056 into develop Oct 23, 2024
22 of 23 checks passed
@causten causten deleted the saghir/add-arch-hipblaslt-supported branch October 23, 2024 19:40
sohbodas pushed a commit to sohbodas/AMDMIGraphX that referenced this pull request Oct 28, 2024
lajagapp pushed a commit to lajagapp/AMDMIGraphX that referenced this pull request Nov 22, 2024