forked from GarvitBanga/RoboSignature
gradual_randomkeys_outputlogs..out
57 lines (54 loc) · 20.4 KB
No module 'xformers'. Proceeding without it.
__git__:sha: 2491dca253e102b0ab7ec586f60c1386867d2171, status: clean, branch: garvit1
__log__:{"train_dir": "Tamper_Resistant_Stable_Signature/train2014500/", "val_dir": "Tamper_Resistant_Stable_Signature/test2014/", "ldm_config": "Tamper_Resistant_Stable_Signature/stable-diffusion-2-1/v2-inference.yaml", "ldm_ckpt": "Tamper_Resistant_Stable_Signature/stable-diffusion-2-1-base/v2-1_512-ema-pruned.ckpt", "msg_decoder_path": "Tamper_Resistant_Stable_Signature/models/dec_48b_whit.torchscript.pt", "num_bits": 48, "redundancy": 1, "decoder_depth": 8, "decoder_channels": 64, "batch_size": 4, "img_size": 256, "loss_i": "watson-vgg", "loss_w": "bce", "lambda_i": 0.2, "lambda_w": 1.0, "optimizer": "AdamW,lr=5e-4", "steps": 100, "warmup_steps": 20, "log_freq": 10, "save_img_freq": 1000, "num_keys": 1, "output_dir": "output/", "seed": 0, "debug": false, "strategy": 2}
>>> Building LDM model with config Tamper_Resistant_Stable_Signature/stable-diffusion-2-1/v2-inference.yaml and weights from Tamper_Resistant_Stable_Signature/stable-diffusion-2-1-base/v2-1_512-ema-pruned.ckpt...
Loading model from Tamper_Resistant_Stable_Signature/stable-diffusion-2-1-base/v2-1_512-ema-pruned.ckpt
Global Step: 220000
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 865.91 M params.
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla' with 512 in_channels
loaded LDM decoder state_dict with message
_IncompatibleKeys(missing_keys=['encoder.conv_in.weight', 'encoder.conv_in.bias', 'encoder.down.0.block.0.norm1.weight', 'encoder.down.0.block.0.norm1.bias', 'encoder.down.0.block.0.conv1.weight', 'encoder.down.0.block.0.conv1.bias', 'encoder.down.0.block.0.norm2.weight', 'encoder.down.0.block.0.norm2.bias', 'encoder.down.0.block.0.conv2.weight', 'encoder.down.0.block.0.conv2.bias', 'encoder.down.0.block.1.norm1.weight', 'encoder.down.0.block.1.norm1.bias', 'encoder.down.0.block.1.conv1.weight', 'encoder.down.0.block.1.conv1.bias', 'encoder.down.0.block.1.norm2.weight', 'encoder.down.0.block.1.norm2.bias', 'encoder.down.0.block.1.conv2.weight', 'encoder.down.0.block.1.conv2.bias', 'encoder.down.0.downsample.conv.weight', 'encoder.down.0.downsample.conv.bias', 'encoder.down.1.block.0.norm1.weight', 'encoder.down.1.block.0.norm1.bias', 'encoder.down.1.block.0.conv1.weight', 'encoder.down.1.block.0.conv1.bias', 'encoder.down.1.block.0.norm2.weight', 'encoder.down.1.block.0.norm2.bias', 'encoder.down.1.block.0.conv2.weight', 'encoder.down.1.block.0.conv2.bias', 'encoder.down.1.block.0.nin_shortcut.weight', 'encoder.down.1.block.0.nin_shortcut.bias', 'encoder.down.1.block.1.norm1.weight', 'encoder.down.1.block.1.norm1.bias', 'encoder.down.1.block.1.conv1.weight', 'encoder.down.1.block.1.conv1.bias', 'encoder.down.1.block.1.norm2.weight', 'encoder.down.1.block.1.norm2.bias', 'encoder.down.1.block.1.conv2.weight', 'encoder.down.1.block.1.conv2.bias', 'encoder.down.1.downsample.conv.weight', 'encoder.down.1.downsample.conv.bias', 'encoder.down.2.block.0.norm1.weight', 'encoder.down.2.block.0.norm1.bias', 'encoder.down.2.block.0.conv1.weight', 'encoder.down.2.block.0.conv1.bias', 'encoder.down.2.block.0.norm2.weight', 'encoder.down.2.block.0.norm2.bias', 'encoder.down.2.block.0.conv2.weight', 'encoder.down.2.block.0.conv2.bias', 'encoder.down.2.block.0.nin_shortcut.weight', 'encoder.down.2.block.0.nin_shortcut.bias', 'encoder.down.2.block.1.norm1.weight', 
'encoder.down.2.block.1.norm1.bias', 'encoder.down.2.block.1.conv1.weight', 'encoder.down.2.block.1.conv1.bias', 'encoder.down.2.block.1.norm2.weight', 'encoder.down.2.block.1.norm2.bias', 'encoder.down.2.block.1.conv2.weight', 'encoder.down.2.block.1.conv2.bias', 'encoder.down.2.downsample.conv.weight', 'encoder.down.2.downsample.conv.bias', 'encoder.down.3.block.0.norm1.weight', 'encoder.down.3.block.0.norm1.bias', 'encoder.down.3.block.0.conv1.weight', 'encoder.down.3.block.0.conv1.bias', 'encoder.down.3.block.0.norm2.weight', 'encoder.down.3.block.0.norm2.bias', 'encoder.down.3.block.0.conv2.weight', 'encoder.down.3.block.0.conv2.bias', 'encoder.down.3.block.1.norm1.weight', 'encoder.down.3.block.1.norm1.bias', 'encoder.down.3.block.1.conv1.weight', 'encoder.down.3.block.1.conv1.bias', 'encoder.down.3.block.1.norm2.weight', 'encoder.down.3.block.1.norm2.bias', 'encoder.down.3.block.1.conv2.weight', 'encoder.down.3.block.1.conv2.bias', 'encoder.mid.block_1.norm1.weight', 'encoder.mid.block_1.norm1.bias', 'encoder.mid.block_1.conv1.weight', 'encoder.mid.block_1.conv1.bias', 'encoder.mid.block_1.norm2.weight', 'encoder.mid.block_1.norm2.bias', 'encoder.mid.block_1.conv2.weight', 'encoder.mid.block_1.conv2.bias', 'encoder.mid.attn_1.norm.weight', 'encoder.mid.attn_1.norm.bias', 'encoder.mid.attn_1.q.weight', 'encoder.mid.attn_1.q.bias', 'encoder.mid.attn_1.k.weight', 'encoder.mid.attn_1.k.bias', 'encoder.mid.attn_1.v.weight', 'encoder.mid.attn_1.v.bias', 'encoder.mid.attn_1.proj_out.weight', 'encoder.mid.attn_1.proj_out.bias', 'encoder.mid.block_2.norm1.weight', 'encoder.mid.block_2.norm1.bias', 'encoder.mid.block_2.conv1.weight', 'encoder.mid.block_2.conv1.bias', 'encoder.mid.block_2.norm2.weight', 'encoder.mid.block_2.norm2.bias', 'encoder.mid.block_2.conv2.weight', 'encoder.mid.block_2.conv2.bias', 'encoder.norm_out.weight', 'encoder.norm_out.bias', 'encoder.conv_out.weight', 'encoder.conv_out.bias', 'quant_conv.weight', 'quant_conv.bias'], unexpected_keys=[])
you should check that the decoder keys are correctly matched
>>> Building hidden decoder with weights from Tamper_Resistant_Stable_Signature/models/dec_48b_whit.torchscript.pt...
>>> Loading data from Tamper_Resistant_Stable_Signature/train2014500/ and Tamper_Resistant_Stable_Signature/test2014/...
>>> Creating losses...
Losses: bce and watson-vgg...
>>> Creating key with 48 bits...
Key: 111010110101000001010111010011010100010000100111
>>> Training...
{"iteration": 0, "loss": 0.15569673478603363, "loss_w": 0.15569467842578888, "loss_i": 1.0295131687598769e-05, "psnr": Infinity, "bit_acc_avg": 0.9791666865348816, "word_acc_avg": 0.0, "lr": 0.0}
Train [ 0/100] eta: 0:05:13 iteration: 0.000000 (0.000000) loss: 0.155697 (0.155697) loss_w: 0.155695 (0.155695) loss_i: 0.000010 (0.000010) psnr: inf (inf) bit_acc_avg: 0.979167 (0.979167) word_acc_avg: 0.000000 (0.000000) lr: 0.000000 (0.000000) time: 3.138910 data: 0.190289 max mem: 11120
{"iteration": 10, "loss": 1.6047399044036865, "loss_w": 0.9724656939506531, "loss_i": 3.1613712310791016, "psnr": 28.236757278442383, "bit_acc_avg": 0.40625, "word_acc_avg": 0.0, "lr": 0.00025}
Train [ 10/100] eta: 0:01:35 iteration: 5.000000 (5.000000) loss: 1.197405 (1.119734) loss_w: 0.940237 (0.894067) loss_i: 1.025835 (1.128335) psnr: 40.697350 (inf) bit_acc_avg: 0.703125 (0.694602) word_acc_avg: 0.000000 (0.000000) lr: 0.000125 (0.000125) time: 1.056292 data: 0.017422 max mem: 11744
{"iteration": 20, "loss": 1.5279508829116821, "loss_w": 0.7617127299308777, "loss_i": 3.831190586090088, "psnr": 25.166099548339844, "bit_acc_avg": 0.5104166865348816, "word_acc_avg": 0.0, "lr": 0.0005}
Train [ 20/100] eta: 0:01:07 iteration: 10.000000 (10.000000) loss: 1.527951 (1.340257) loss_w: 0.828636 (0.841645) loss_i: 3.161371 (2.493056) psnr: 28.064470 (inf) bit_acc_avg: 0.552083 (0.599454) word_acc_avg: 0.000000 (0.000000) lr: 0.000250 (0.000250) time: 0.725244 data: 0.000136 max mem: 11744
{"iteration": 30, "loss": 1.4638421535491943, "loss_w": 0.7641315460205078, "loss_i": 3.4985525608062744, "psnr": 23.73548698425293, "bit_acc_avg": 0.5104166865348816, "word_acc_avg": 0.0, "lr": 0.00048100794336156604}
Train [ 30/100] eta: 0:00:53 iteration: 20.000000 (15.000000) loss: 1.546970 (1.395401) loss_w: 0.772154 (0.819381) loss_i: 3.879369 (2.880104) psnr: 25.136906 (inf) bit_acc_avg: 0.484375 (0.564852) word_acc_avg: 0.000000 (0.000000) lr: 0.000481 (0.000328) time: 0.602853 data: 0.000140 max mem: 11744
{"iteration": 40, "loss": 1.4644982814788818, "loss_w": 0.7459887266159058, "loss_i": 3.592548131942749, "psnr": 28.041587829589844, "bit_acc_avg": 0.4479166865348816, "word_acc_avg": 0.0, "lr": 0.0004269231419060436}
Train [ 40/100] eta: 0:00:43 iteration: 30.000000 (20.000000) loss: 1.478871 (1.413704) loss_w: 0.763969 (0.802372) loss_i: 3.619943 (3.056661) psnr: 24.943491 (inf) bit_acc_avg: 0.473958 (0.544207) word_acc_avg: 0.000000 (0.000000) lr: 0.000477 (0.000359) time: 0.603463 data: 0.000140 max mem: 11744
{"iteration": 50, "loss": 1.4064468145370483, "loss_w": 0.7779983282089233, "loss_i": 3.142242431640625, "psnr": 26.433094024658203, "bit_acc_avg": 0.5208333730697632, "word_acc_avg": 0.0, "lr": 0.00034597951637508993}
Train [ 50/100] eta: 0:00:35 iteration: 40.000000 (25.000000) loss: 1.437001 (1.414791) loss_w: 0.745989 (0.792249) loss_i: 3.526440 (3.112713) psnr: 25.281979 (inf) bit_acc_avg: 0.479167 (0.533905) word_acc_avg: 0.000000 (0.000000) lr: 0.000420 (0.000364) time: 0.603752 data: 0.000137 max mem: 11744
{"iteration": 60, "loss": 1.2924052476882935, "loss_w": 0.7290522456169128, "loss_i": 2.8167648315429688, "psnr": 26.947566986083984, "bit_acc_avg": 0.4947916865348816, "word_acc_avg": 0.0, "lr": 0.0002505}
Train [ 60/100] eta: 0:00:27 iteration: 50.000000 (30.000000) loss: 1.366486 (1.399923) loss_w: 0.733695 (0.781105) loss_i: 3.142242 (3.094090) psnr: 26.036518 (inf) bit_acc_avg: 0.494792 (0.529201) word_acc_avg: 0.000000 (0.000000) lr: 0.000337 (0.000352) time: 0.604581 data: 0.000137 max mem: 11744
{"iteration": 70, "loss": 1.2715144157409668, "loss_w": 0.742590606212616, "loss_i": 2.6446192264556885, "psnr": 27.936569213867188, "bit_acc_avg": 0.4375, "word_acc_avg": 0.0, "lr": 0.0001550204836249101}
Train [ 70/100] eta: 0:00:20 iteration: 60.000000 (35.000000) loss: 1.334868 (1.387794) loss_w: 0.729052 (0.774298) loss_i: 2.956598 (3.067479) psnr: 26.712069 (inf) bit_acc_avg: 0.494792 (0.522594) word_acc_avg: 0.000000 (0.000000) lr: 0.000241 (0.000331) time: 0.605485 data: 0.000143 max mem: 11744
{"iteration": 80, "loss": 1.2906908988952637, "loss_w": 0.7919754981994629, "loss_i": 2.493576765060425, "psnr": 28.791383743286133, "bit_acc_avg": 0.4270833432674408, "word_acc_avg": 0.0, "lr": 7.40768580939564e-05}
Train [ 80/100] eta: 0:00:13 iteration: 70.000000 (40.000000) loss: 1.278248 (1.370997) loss_w: 0.719738 (0.768458) loss_i: 2.774311 (3.012695) psnr: 27.374832 (inf) bit_acc_avg: 0.515625 (0.520255) word_acc_avg: 0.000000 (0.000000) lr: 0.000146 (0.000303) time: 0.605617 data: 0.000145 max mem: 11744
{"iteration": 90, "loss": 1.2885313034057617, "loss_w": 0.762749195098877, "loss_i": 2.628910541534424, "psnr": 26.954853057861328, "bit_acc_avg": 0.484375, "word_acc_avg": 0.0, "lr": 1.9992056638433958e-05}
Train [ 90/100] eta: 0:00:06 iteration: 80.000000 (45.000000) loss: 1.248925 (1.355035) loss_w: 0.714869 (0.762318) loss_i: 2.522770 (2.963589) psnr: 27.760883 (inf) bit_acc_avg: 0.520833 (0.520490) word_acc_avg: 0.000000 (0.000000) lr: 0.000067 (0.000274) time: 0.605702 data: 0.000141 max mem: 11744
Train [ 99/100] eta: 0:00:00 iteration: 89.000000 (49.500000) loss: 1.223005 (1.341235) loss_w: 0.714869 (0.758030) loss_i: 2.493577 (2.916026) psnr: 27.947460 (inf) bit_acc_avg: 0.500000 (0.521354) word_acc_avg: 0.000000 (0.000000) lr: 0.000020 (0.000250) time: 0.606358 data: 0.000136 max mem: 11744
Train Total time: 0:01:05 (0.648405 s / it)
Averaged train stats: iteration: 89.000000 (49.500000) loss: 1.223005 (1.341235) loss_w: 0.714869 (0.758030) loss_i: 2.493577 (2.916026) psnr: 27.947460 (inf) bit_acc_avg: 0.500000 (0.521354) word_acc_avg: 0.000000 (0.000000) lr: 0.000020 (0.000250)
Eval [ 0/63] eta: 0:05:15 iteration: 0.000000 (0.000000) psnr: 28.542006 (28.542006) bit_acc_none: 0.550781 (0.550781) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.557292 (0.557292) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.545573 (0.545573) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.485677 (0.485677) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.428385 (0.428385) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.518229 (0.518229) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.527344 (0.527344) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.529948 (0.529948) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.528646 (0.528646) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.524740 (0.524740) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.500000 (0.500000) word_acc_jpeg_50: 0.000000 (0.000000) time: 5.008920 data: 0.357805 max mem: 11744
Eval [10/63] eta: 0:01:33 iteration: 5.000000 (5.000000) psnr: 27.976208 (28.173924) bit_acc_none: 0.550781 (0.549361) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.549479 (0.548295) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.565104 (0.566406) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.497396 (0.502012) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.444010 (0.436553) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.514323 (0.506629) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.531250 (0.541193) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.529948 (0.532079) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.526042 (0.531132) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.528646 (0.531368) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.516927 (0.515625) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.759899 data: 0.032664 max mem: 11744
Eval [20/63] eta: 0:01:09 iteration: 10.000000 (10.000000) psnr: 27.857563 (28.021709) bit_acc_none: 0.541667 (0.543837) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.540365 (0.545449) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.557292 (0.556114) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.497396 (0.496838) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.425781 (0.430928) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.505208 (0.508185) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.529948 (0.531808) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.529948 (0.537388) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.524740 (0.533358) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.522135 (0.525422) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.519531 (0.519779) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.436497 data: 0.000150 max mem: 11744
Eval [30/63] eta: 0:00:51 iteration: 20.000000 (15.000000) psnr: 27.936489 (28.055766) bit_acc_none: 0.537760 (0.541667) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.540365 (0.544859) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.548177 (0.552755) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.493490 (0.494120) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.424479 (0.432754) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.507812 (0.508107) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.524740 (0.529570) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.540365 (0.536668) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.524740 (0.531040) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.516927 (0.524740) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.519531 (0.519027) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.438938 data: 0.000153 max mem: 11744
Eval [40/63] eta: 0:00:35 iteration: 30.000000 (20.000000) psnr: 28.076778 (28.070355) bit_acc_none: 0.536458 (0.540460) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.542969 (0.545001) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.548177 (0.552846) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.488281 (0.492060) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.427083 (0.431942) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.505208 (0.505812) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.524740 (0.527376) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.529948 (0.536141) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.524740 (0.530583) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.524740 (0.523342) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.514323 (0.518642) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.439667 data: 0.000156 max mem: 11744
Eval [50/63] eta: 0:00:19 iteration: 40.000000 (25.000000) psnr: 27.966314 (27.983601) bit_acc_none: 0.533854 (0.539292) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.548177 (0.545496) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.552083 (0.553156) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.483073 (0.491396) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.429688 (0.431526) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.500000 (0.505208) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.524740 (0.527522) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.528646 (0.535233) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.524740 (0.530178) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.528646 (0.524203) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.520833 (0.521267) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.441013 data: 0.000162 max mem: 11744
Eval [60/63] eta: 0:00:04 iteration: 50.000000 (30.000000) psnr: 27.888069 (27.984823) bit_acc_none: 0.529948 (0.538379) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.544271 (0.546064) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.550781 (0.552275) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.488281 (0.490907) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.429688 (0.430797) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.498698 (0.504995) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.528646 (0.526831) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.529948 (0.535434) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.524740 (0.530140) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.527344 (0.522840) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.528646 (0.521367) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.441290 data: 0.000152 max mem: 11744
Eval [62/63] eta: 0:00:01 iteration: 52.000000 (31.000000) psnr: 27.878422 (27.971604) bit_acc_none: 0.529948 (0.537864) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.544271 (0.545366) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.550781 (0.551794) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.488281 (0.490865) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.429688 (0.430370) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.498698 (0.504733) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.515625 (0.526434) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.529948 (0.535611) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.524740 (0.529700) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.520833 (0.522425) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.526042 (0.520565) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.411891 data: 0.000147 max mem: 11744
Eval Total time: 0:01:33 (1.464043 s / it)
Averaged eval stats: iteration: 52.000000 (31.000000) psnr: 27.878422 (27.971604) bit_acc_none: 0.529948 (0.537864) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.544271 (0.545366) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.550781 (0.551794) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.488281 (0.490865) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.429688 (0.430370) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.498698 (0.504733) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.515625 (0.526434) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.529948 (0.535611) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.524740 (0.529700) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.520833 (0.522425) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.526042 (0.520565) word_acc_jpeg_50: 0.000000 (0.000000)
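Editor's note: the per-iteration JSON records above (the lines beginning `{"iteration": ...}`, emitted every `log_freq` steps per the config on the `__log__` line) can be extracted with a short script like the one below. This is a minimal sketch, not part of the original run; the helper name `parse_stats` is hypothetical, and it relies on the fact that Python's `json` module accepts the bare `Infinity` token seen in the iteration-0 record.

```python
import json

def parse_stats(path):
    """Collect the per-iteration JSON stat records from a training log.

    Matches lines of the form {"iteration": ..., "loss": ..., ...};
    json.loads handles the bare Infinity token (e.g. "psnr": Infinity)
    by returning float('inf').
    """
    stats = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith('{"iteration"'):
                stats.append(json.loads(line))
    return stats

# Example on one record copied verbatim (abridged) from the log above:
record = json.loads(
    '{"iteration": 10, "loss": 1.6047399044036865, '
    '"loss_w": 0.9724656939506531, "psnr": 28.236757278442383, '
    '"bit_acc_avg": 0.40625, "word_acc_avg": 0.0, "lr": 0.00025}'
)
print(record["bit_acc_avg"])  # 0.40625
```

A plotting or tabulating step can then read fields such as `bit_acc_avg` and `psnr` directly from the returned dictionaries.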