Hello, could you please tell me how to use the pretrained models (flatnet_separable_pointGrey_randomInit or flatnet_separable_pointGrey_transposeInit) to train FlatNet-sep? Thank you!
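A first step that might help: inspect what one of the checkpoints actually contains. The sketch below assumes a standard `torch.save` layout; the filename extension and the possible `'state_dict'` wrapper key are guesses, not the repository's documented procedure.

```python
import torch

# The filename and checkpoint layout here are assumptions; inspect the
# file you downloaded to see what it actually contains.
ckpt = torch.load('flatnet_separable_pointGrey_transposeInit.pth',
                  map_location='cpu')
print(type(ckpt))
if isinstance(ckpt, dict):
    print(list(ckpt.keys()))

# If it turns out to be a plain state_dict (or wraps one under a
# 'state_dict' key), it could be loaded into the generator that
# main.py builds, before training starts:
#   state = ckpt.get('state_dict', ckpt)
#   gen.load_state_dict(state)
```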
Running main.py to train from scratch may not need the pretrained models, but when I train I get this error:

```
../aten/src/ATen/native/cuda/Loss.cu:95: operator(): block: [0,0,0], thread: [2,0,0] Assertion `target_val >= zero && target_val <= one` failed.
Traceback (most recent call last):
  File "main.py", line 156, in <module>
    disc_err = train_discriminator_epoch(gen, dis, optim_dis, dis_criterion, train_loader, opt.disPreEpochs, disc_err, device)
  File "/root/autodl-tmp/Flatnet/flatnet-flatnet-sep/fns_all.py", line 36, in train_discriminator_epoch
    dis_loss = criterion(dis(high_res_real), target_real) + criterion(dis(Variable(high_res_fake.data)), target_fake)
  File "/root/miniconda3/envs/flatnet/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/root/miniconda3/envs/flatnet/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/flatnet/lib/python3.8/site-packages/torch/nn/modules/loss.py", line 621, in forward
    return F.binary_cross_entropy(input, target, weight=self.weight, reduction=self.reduction)
  File "/root/miniconda3/envs/flatnet/lib/python3.8/site-packages/torch/nn/functional.py", line 3172, in binary_cross_entropy
    return torch._C._nn.binary_cross_entropy(input, target, weight, reduction_enum)
```
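For anyone hitting the same assertion: `F.binary_cross_entropy` requires every value of both its input and its target to lie in [0, 1], and the CUDA kernel aborts when the target is out of range. One common cause in GAN training code is one-sided label smoothing that pushes "real" labels above 1.0. Below is a minimal sketch of that failure mode and a clamp-based workaround; the label-smoothing line is an assumption about what fns_all.py might do, not the repository's actual code.

```python
import torch
import torch.nn as nn

criterion = nn.BCELoss()

# BCELoss requires input AND target values in [0, 1]. Smoothed "real"
# labels like these (a guess at what the training script does) can
# exceed 1.0 and trigger the Loss.cu assertion:
target_real = torch.rand(4, 1) * 0.5 + 0.7   # values in [0.7, 1.2)

# Workaround: clamp the smoothed labels back into the valid range.
target_real = target_real.clamp(max=1.0)

pred = torch.sigmoid(torch.randn(4, 1))      # stand-in discriminator output in (0, 1)
loss = criterion(pred, target_real)          # runs without the assertion
print(loss.item())
```

Replacing `nn.BCELoss` with `nn.BCEWithLogitsLoss` on raw discriminator outputs is another common way to avoid the input-range check, though out-of-range targets are still worth fixing.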