Warn if wrong type is given for Llama export for XNNPACK
Followup to #7775
mergennachin committed Feb 4, 2025
1 parent: a3455d9 · commit: 6d60669
Showing 1 changed file with 6 additions and 0 deletions.
examples/models/llama/export_llama_lib.py
@@ -674,6 +674,12 @@ def _validate_args(args):
             "If you need this feature, please file an issue."
         )
 
+    if args.xnnpack:
+        if args.dtype_override not in ["fp32", "fp16"]:
+            raise ValueError(
+                f"XNNPACK supports either fp32 or fp16 dtypes only for now. Given {args.dtype_override}."
+            )
+
 
 def _export_llama(args) -> LLMEdgeManager:  # noqa: C901
     _validate_args(args)
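For reference, the added check behaves as in the following minimal sketch. It is an illustration, not the real export path: check_xnnpack_dtype is a hypothetical stand-in for the updated _validate_args, and argparse.Namespace stands in for the parsed CLI arguments; only the field names xnnpack and dtype_override and the error message come from the diff above.

from argparse import Namespace

def check_xnnpack_dtype(args):
    # Mirrors the validation added in this commit: the XNNPACK path
    # currently accepts only fp32 or fp16 dtype overrides.
    if args.xnnpack and args.dtype_override not in ["fp32", "fp16"]:
        raise ValueError(
            f"XNNPACK supports either fp32 or fp16 dtypes only for now. Given {args.dtype_override}."
        )

# fp16 is allowed, so this returns without raising.
check_xnnpack_dtype(Namespace(xnnpack=True, dtype_override="fp16"))

# bf16 is rejected up front instead of failing later in the export.
try:
    check_xnnpack_dtype(Namespace(xnnpack=True, dtype_override="bf16"))
except ValueError as err:
    print(err)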
