MultiThreshold out_scale problem to solve #828
Charlie-ZHIJIE asked this question in Q&A
Hi guys,
I've run into a problem while trying to deploy an ML model to an FPGA using FINN.
The model is simple:
```python
import torch.nn as nn
from brevitas.nn import QuantLinear, QuantReLU
from brevitas.core.quant import QuantType

class CustomBrevitasModel(nn.Module):
    def __init__(self, weight_bit_width):
        super(CustomBrevitasModel, self).__init__()
        self.dense1 = QuantLinear(4, OUT_SHAPE, bias=True, weight_bit_width=weight_bit_width)  # OUT_SHAPE defined elsewhere
        self.relu1 = QuantReLU(quant_type=QuantType.INT, bit_width=4, max_val=6)
```
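For context, the model is exported to ONNX before the FINN dataflow transforms are applied. The snippet below is only a rough sketch of that export step; `export_qonnx`, the dummy input shape, and the file name are illustrative and depend on the installed Brevitas version, not my exact script:

```python
# Rough sketch of the export step (names, shapes, and file name are illustrative only).
import torch
from brevitas.export import export_qonnx  # older Brevitas releases use FINNManager instead

model = CustomBrevitasModel(weight_bit_width=4)
model.eval()

# Dummy input matching the 4-feature input of dense1
dummy_input = torch.randn(1, 4)
export_qonnx(model, dummy_input, export_path="custom_brevitas_model.onnx")
```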
The problem appears when I try to convert the model to HLS layers; the error is:
```
AssertionError                            Traceback (most recent call last)
Cell In[45], line 5
      3 model = model.transform(to_hls.InferQuantizedMatrixVectorActivation(mem_mode))
      4 # input quantization (if any) to standalone thresholding
----> 5 model = model.transform(to_hls.InferThresholdingLayer())
      6 model = model.transform(to_hls.InferConvInpGen())
      7 model = model.transform(to_hls.InferStreamingMaxPool())

finn/deps/qonnx/src/qonnx/core/modelwrapper.py:140, in ModelWrapper.transform(self, transformation, make_deepcopy, cleanup)
    138 model_was_changed = True
    139 while model_was_changed:
--> 140     (transformed_model, model_was_changed) = transformation.apply(transformed_model)
    141 if cleanup:
    142     transformed_model.cleanup()

finn/src/finn/transformation/fpgadataflow/convert_to_hls_layers.py:1114, in InferThresholdingLayer.apply(self, model)
   1112 odt = model.get_tensor_datatype(thl_output)
   1113 scale = getCustomOp(node).get_nodeattr("out_scale")
-> 1114 assert scale == 1.0, (
   1115     node.name
   1116     + ": MultiThreshold out_scale must be 1 for HLS conversion."
   1117 )
   1118 actval = getCustomOp(node).get_nodeattr("out_bias")
   1119 assert int(actval) == actval, (
   1120     node.name
   1121     + ": MultiThreshold out_bias must be integer for HLS conversion."
   1122 )

AssertionError: MultiThreshold_1: MultiThreshold out_scale must be 1 for HLS conversion.
```
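For reference, the attribute that trips the assertion can be inspected on the model before running the to_hls transforms. This is just a minimal sketch (the file name is a placeholder):

```python
# Minimal sketch: list out_scale / out_bias for every MultiThreshold node
# before attempting InferThresholdingLayer (the file name is a placeholder).
from qonnx.core.modelwrapper import ModelWrapper
from qonnx.custom_op.registry import getCustomOp

model = ModelWrapper("model_after_streamline.onnx")
for node in model.get_nodes_by_op_type("MultiThreshold"):
    inst = getCustomOp(node)
    print(node.name,
          "out_scale =", inst.get_nodeattr("out_scale"),
          "out_bias =", inst.get_nodeattr("out_bias"))
```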
Any ideas on how to solve this MultiThreshold out_scale problem?
Thanks,
Zhijie Xu