randomkeys_outputlogs.out
No module 'xformers'. Proceeding without it.
__git__:sha: 612e47c95e9ca814a5f97e617e98b9d151aeefd6, status: clean, branch: garvit1
__log__:{"train_dir": "Tamper_Resistant_Stable_Signature/train2014500/", "val_dir": "Tamper_Resistant_Stable_Signature/test2014/", "ldm_config": "Tamper_Resistant_Stable_Signature/stable-diffusion-2-1/v2-inference.yaml", "ldm_ckpt": "Tamper_Resistant_Stable_Signature/stable-diffusion-2-1-base/v2-1_512-ema-pruned.ckpt", "msg_decoder_path": "Tamper_Resistant_Stable_Signature/models/dec_48b_whit.torchscript.pt", "num_bits": 48, "redundancy": 1, "decoder_depth": 8, "decoder_channels": 64, "batch_size": 4, "img_size": 256, "loss_i": "watson-vgg", "loss_w": "bce", "lambda_i": 0.2, "lambda_w": 1.0, "optimizer": "AdamW,lr=5e-4", "steps": 100, "warmup_steps": 20, "log_freq": 10, "save_img_freq": 1000, "num_keys": 1, "output_dir": "output/", "seed": 0, "debug": false, "strategy": 1}
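The __log__ line above is the full run configuration serialized as JSON: 100 training steps with a 20-step warmup, AdamW at lr 5e-4, a single 48-bit key, and loss weights lambda_w = 1.0 (watermark) and lambda_i = 0.2 (image). A plausible sketch of how a script emits such a line from its argument namespace (the variable name params is illustrative, not taken from the repo):

import json

# params: the argparse.Namespace holding the run configuration
print("__log__:" + json.dumps(vars(params)))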
>>> Building LDM model with config Tamper_Resistant_Stable_Signature/stable-diffusion-2-1/v2-inference.yaml and weights from Tamper_Resistant_Stable_Signature/stable-diffusion-2-1-base/v2-1_512-ema-pruned.ckpt...
Loading model from Tamper_Resistant_Stable_Signature/stable-diffusion-2-1-base/v2-1_512-ema-pruned.ckpt
Global Step: 220000
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 865.91 M params.
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
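(The 4096 above is just 4 × 32 × 32: with the f=8 autoencoder, a 256×256 training image ("img_size": 256 in the config) maps to a 4-channel 32×32 latent, so each latent z has 4096 dimensions.)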
making attention of type 'vanilla' with 512 in_channels
loaded LDM decoder state_dict with message
_IncompatibleKeys(missing_keys=['encoder.conv_in.weight', 'encoder.conv_in.bias', 'encoder.down.0.block.0.norm1.weight', 'encoder.down.0.block.0.norm1.bias', 'encoder.down.0.block.0.conv1.weight', 'encoder.down.0.block.0.conv1.bias', 'encoder.down.0.block.0.norm2.weight', 'encoder.down.0.block.0.norm2.bias', 'encoder.down.0.block.0.conv2.weight', 'encoder.down.0.block.0.conv2.bias', 'encoder.down.0.block.1.norm1.weight', 'encoder.down.0.block.1.norm1.bias', 'encoder.down.0.block.1.conv1.weight', 'encoder.down.0.block.1.conv1.bias', 'encoder.down.0.block.1.norm2.weight', 'encoder.down.0.block.1.norm2.bias', 'encoder.down.0.block.1.conv2.weight', 'encoder.down.0.block.1.conv2.bias', 'encoder.down.0.downsample.conv.weight', 'encoder.down.0.downsample.conv.bias', 'encoder.down.1.block.0.norm1.weight', 'encoder.down.1.block.0.norm1.bias', 'encoder.down.1.block.0.conv1.weight', 'encoder.down.1.block.0.conv1.bias', 'encoder.down.1.block.0.norm2.weight', 'encoder.down.1.block.0.norm2.bias', 'encoder.down.1.block.0.conv2.weight', 'encoder.down.1.block.0.conv2.bias', 'encoder.down.1.block.0.nin_shortcut.weight', 'encoder.down.1.block.0.nin_shortcut.bias', 'encoder.down.1.block.1.norm1.weight', 'encoder.down.1.block.1.norm1.bias', 'encoder.down.1.block.1.conv1.weight', 'encoder.down.1.block.1.conv1.bias', 'encoder.down.1.block.1.norm2.weight', 'encoder.down.1.block.1.norm2.bias', 'encoder.down.1.block.1.conv2.weight', 'encoder.down.1.block.1.conv2.bias', 'encoder.down.1.downsample.conv.weight', 'encoder.down.1.downsample.conv.bias', 'encoder.down.2.block.0.norm1.weight', 'encoder.down.2.block.0.norm1.bias', 'encoder.down.2.block.0.conv1.weight', 'encoder.down.2.block.0.conv1.bias', 'encoder.down.2.block.0.norm2.weight', 'encoder.down.2.block.0.norm2.bias', 'encoder.down.2.block.0.conv2.weight', 'encoder.down.2.block.0.conv2.bias', 'encoder.down.2.block.0.nin_shortcut.weight', 'encoder.down.2.block.0.nin_shortcut.bias', 'encoder.down.2.block.1.norm1.weight', 'encoder.down.2.block.1.norm1.bias', 'encoder.down.2.block.1.conv1.weight', 'encoder.down.2.block.1.conv1.bias', 'encoder.down.2.block.1.norm2.weight', 'encoder.down.2.block.1.norm2.bias', 'encoder.down.2.block.1.conv2.weight', 'encoder.down.2.block.1.conv2.bias', 'encoder.down.2.downsample.conv.weight', 'encoder.down.2.downsample.conv.bias', 'encoder.down.3.block.0.norm1.weight', 'encoder.down.3.block.0.norm1.bias', 'encoder.down.3.block.0.conv1.weight', 'encoder.down.3.block.0.conv1.bias', 'encoder.down.3.block.0.norm2.weight', 'encoder.down.3.block.0.norm2.bias', 'encoder.down.3.block.0.conv2.weight', 'encoder.down.3.block.0.conv2.bias', 'encoder.down.3.block.1.norm1.weight', 'encoder.down.3.block.1.norm1.bias', 'encoder.down.3.block.1.conv1.weight', 'encoder.down.3.block.1.conv1.bias', 'encoder.down.3.block.1.norm2.weight', 'encoder.down.3.block.1.norm2.bias', 'encoder.down.3.block.1.conv2.weight', 'encoder.down.3.block.1.conv2.bias', 'encoder.mid.block_1.norm1.weight', 'encoder.mid.block_1.norm1.bias', 'encoder.mid.block_1.conv1.weight', 'encoder.mid.block_1.conv1.bias', 'encoder.mid.block_1.norm2.weight', 'encoder.mid.block_1.norm2.bias', 'encoder.mid.block_1.conv2.weight', 'encoder.mid.block_1.conv2.bias', 'encoder.mid.attn_1.norm.weight', 'encoder.mid.attn_1.norm.bias', 'encoder.mid.attn_1.q.weight', 'encoder.mid.attn_1.q.bias', 'encoder.mid.attn_1.k.weight', 'encoder.mid.attn_1.k.bias', 'encoder.mid.attn_1.v.weight', 'encoder.mid.attn_1.v.bias', 'encoder.mid.attn_1.proj_out.weight', 'encoder.mid.attn_1.proj_out.bias', 
'encoder.mid.block_2.norm1.weight', 'encoder.mid.block_2.norm1.bias', 'encoder.mid.block_2.conv1.weight', 'encoder.mid.block_2.conv1.bias', 'encoder.mid.block_2.norm2.weight', 'encoder.mid.block_2.norm2.bias', 'encoder.mid.block_2.conv2.weight', 'encoder.mid.block_2.conv2.bias', 'encoder.norm_out.weight', 'encoder.norm_out.bias', 'encoder.conv_out.weight', 'encoder.conv_out.bias', 'quant_conv.weight', 'quant_conv.bias'], unexpected_keys=[])
you should check that the decoder keys are correctly matched
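The _IncompatibleKeys report above is expected when only the decoder half of the autoencoder is fine-tuned: the loaded state dict contains decoder weights only, so every encoder.* and quant_conv.* key is reported missing and nothing is unexpected. A minimal sketch of the kind of non-strict load that produces this message (ldm_ae and the checkpoint path are stand-ins, not the script's actual names):

import torch

state_dict = torch.load("path/to/decoder_only_state_dict.pth", map_location="cpu")
msg = ldm_ae.load_state_dict(state_dict, strict=False)  # ldm_ae: the LDM autoencoder
print(f"loaded LDM decoder state_dict with message\n{msg}")
# Sanity checks matching the log: nothing unexpected, and everything missing
# belongs to the frozen encoder side that the checkpoint never contained.
assert msg.unexpected_keys == []
assert all(k.startswith(("encoder.", "quant_conv.")) for k in msg.missing_keys)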
>>> Building hidden decoder with weights from Tamper_Resistant_Stable_Signature/models/dec_48b_whit.torchscript.pt...
>>> Loading data from Tamper_Resistant_Stable_Signature/train2014500/ and Tamper_Resistant_Stable_Signature/test2014/...
>>> Creating losses...
Losses: bce and watson-vgg...
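Per the config, the training objective combines the two losses as loss = lambda_w * loss_w + lambda_i * loss_i with lambda_w = 1.0 and lambda_i = 0.2: a BCE term pushing the extracted bits toward the key, and a Watson-VGG perceptual term keeping the watermarked image close to the original. A hedged sketch of that combination (decoded, keys, imgs_w, imgs_orig are illustrative names):

import torch.nn.functional as F

def watson_vgg(x, y):
    # Placeholder for the repo's Watson-VGG perceptual loss (not its real
    # definition); a plain MSE stands in so the sketch runs end to end.
    return F.mse_loss(x, y)

# decoded: (B, 48) logits from the watermark extractor; keys: (B, 48) target bits
loss_w = F.binary_cross_entropy_with_logits(decoded, keys)   # "loss_w": "bce"
loss_i = watson_vgg(imgs_w, imgs_orig)                       # "loss_i": "watson-vgg"
loss = 1.0 * loss_w + 0.2 * loss_i                           # lambda_w = 1.0, lambda_i = 0.2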
>>> Creating key with 48 bits...
Key: 111010110101000001010111010011010100010000100111
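The key is a single random 48-bit string ("num_keys": 1, "num_bits": 48, "seed": 0 in the config). A minimal sketch of sampling and printing one; the exact RNG call in the script is an assumption, so the logged key need not reproduce from this:

import torch

torch.manual_seed(0)                  # "seed": 0
key = torch.randint(0, 2, (48,))      # one 48-bit key
print("Key: " + "".join(str(b.item()) for b in key))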
>>> Training...
{"iteration": 0, "loss": 0.03323621675372124, "loss_w": 0.03323415666818619, "loss_i": 1.0295131687598769e-05, "psnr": Infinity, "bit_acc_avg": 1.0, "word_acc_avg": 1.0, "lr": 0.0}
Train [ 0/100] eta: 0:05:54 iteration: 0.000000 (0.000000) loss: 0.033236 (0.033236) loss_w: 0.033234 (0.033234) loss_i: 0.000010 (0.000010) psnr: inf (inf) bit_acc_avg: 1.000000 (1.000000) word_acc_avg: 1.000000 (1.000000) lr: 0.000000 (0.000000) time: 3.547275 data: 0.247242 max mem: 11120
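The psnr: Infinity at iteration 0 is not a bug: the warmup lr is still 0.0, so the fine-tuned decoder is (numerically) identical to the original, the watermarked and reference images match, the MSE is ~0, and PSNR = 10 * log10(max^2 / MSE) diverges. A minimal PSNR sketch (the max_val for a [-1, 1] image range is an assumption):

import torch

def psnr(x, y, max_val=2.0):
    # 10 * log10(max^2 / mse); returns inf when x == y exactly
    mse = torch.mean((x - y) ** 2)
    return 10 * torch.log10(max_val ** 2 / mse)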
{"iteration": 10, "loss": 1.54487943649292, "loss_w": 0.8286144137382507, "loss_i": 3.581324815750122, "psnr": 27.848142623901367, "bit_acc_avg": 0.4479166865348816, "word_acc_avg": 0.0, "lr": 0.00025}
Train [ 10/100] eta: 0:01:39 iteration: 5.000000 (5.000000) loss: 1.601341 (1.705747) loss_w: 1.386745 (1.402276) loss_i: 1.583870 (1.517355) psnr: 37.332275 (inf) bit_acc_avg: 0.468750 (0.528883) word_acc_avg: 0.000000 (0.090909) lr: 0.000125 (0.000125) time: 1.107414 data: 0.022601 max mem: 11744
{"iteration": 20, "loss": 1.5672094821929932, "loss_w": 0.876183271408081, "loss_i": 3.4551310539245605, "psnr": 26.549787521362305, "bit_acc_avg": 0.4270833432674408, "word_acc_avg": 0.0, "lr": 0.0005}
Train [ 20/100] eta: 0:01:09 iteration: 10.000000 (10.000000) loss: 1.506612 (1.602944) loss_w: 0.823057 (1.102269) loss_i: 3.313367 (2.503373) psnr: 27.848143 (inf) bit_acc_avg: 0.473958 (0.508433) word_acc_avg: 0.000000 (0.047619) lr: 0.000250 (0.000250) time: 0.732189 data: 0.000135 max mem: 11744
{"iteration": 30, "loss": 1.4066290855407715, "loss_w": 0.7439231276512146, "loss_i": 3.3135297298431396, "psnr": 25.540557861328125, "bit_acc_avg": 0.5, "word_acc_avg": 0.0, "lr": 0.00048100794336156604}
Train [ 30/100] eta: 0:00:54 iteration: 20.000000 (15.000000) loss: 1.445866 (1.540327) loss_w: 0.746913 (0.989581) loss_i: 3.434398 (2.753731) psnr: 26.457727 (inf) bit_acc_avg: 0.500000 (0.509913) word_acc_avg: 0.000000 (0.032258) lr: 0.000481 (0.000328) time: 0.601441 data: 0.000136 max mem: 11744
{"iteration": 40, "loss": 1.4020326137542725, "loss_w": 0.7195958495140076, "loss_i": 3.4121837615966797, "psnr": 27.589523315429688, "bit_acc_avg": 0.46875, "word_acc_avg": 0.0, "lr": 0.0004269231419060436}
Train [ 40/100] eta: 0:00:44 iteration: 30.000000 (20.000000) loss: 1.420033 (1.522219) loss_w: 0.747803 (0.935023) loss_i: 3.347372 (2.935976) psnr: 26.144150 (inf) bit_acc_avg: 0.489583 (0.506606) word_acc_avg: 0.000000 (0.024390) lr: 0.000477 (0.000359) time: 0.602339 data: 0.000135 max mem: 11744
{"iteration": 50, "loss": 1.372377634048462, "loss_w": 0.7374166250228882, "loss_i": 3.174805164337158, "psnr": 27.606460571289062, "bit_acc_avg": 0.53125, "word_acc_avg": 0.0, "lr": 0.00034597951637508993}
Train [ 50/100] eta: 0:00:35 iteration: 40.000000 (25.000000) loss: 1.444227 (1.506718) loss_w: 0.750559 (0.901200) loss_i: 3.467286 (3.027589) psnr: 25.707981 (inf) bit_acc_avg: 0.484375 (0.501838) word_acc_avg: 0.000000 (0.019608) lr: 0.000420 (0.000364) time: 0.603008 data: 0.000136 max mem: 11744
{"iteration": 60, "loss": 1.2731399536132812, "loss_w": 0.7103152275085449, "loss_i": 2.8141231536865234, "psnr": 26.607192993164062, "bit_acc_avg": 0.53125, "word_acc_avg": 0.0, "lr": 0.0002505}
Train [ 60/100] eta: 0:00:27 iteration: 50.000000 (30.000000) loss: 1.372378 (1.476330) loss_w: 0.738780 (0.871875) loss_i: 3.136258 (3.022271) psnr: 26.436747 (inf) bit_acc_avg: 0.494792 (0.505464) word_acc_avg: 0.000000 (0.016393) lr: 0.000337 (0.000352) time: 0.603400 data: 0.000142 max mem: 11744
{"iteration": 70, "loss": 1.2493622303009033, "loss_w": 0.7261756658554077, "loss_i": 2.6159329414367676, "psnr": 27.62194061279297, "bit_acc_avg": 0.5052083730697632, "word_acc_avg": 0.0, "lr": 0.0001550204836249101}
Train [ 70/100] eta: 0:00:20 iteration: 60.000000 (35.000000) loss: 1.334383 (1.455345) loss_w: 0.732089 (0.855273) loss_i: 2.884012 (3.000360) psnr: 26.575844 (inf) bit_acc_avg: 0.500000 (0.503741) word_acc_avg: 0.000000 (0.014085) lr: 0.000241 (0.000331) time: 0.603866 data: 0.000140 max mem: 11744
{"iteration": 80, "loss": 1.1933726072311401, "loss_w": 0.7250257730484009, "loss_i": 2.3417341709136963, "psnr": 28.91046714782715, "bit_acc_avg": 0.5208333730697632, "word_acc_avg": 0.0, "lr": 7.40768580939564e-05}
Train [ 80/100] eta: 0:00:13 iteration: 70.000000 (40.000000) loss: 1.291707 (1.429972) loss_w: 0.727515 (0.839325) loss_i: 2.734622 (2.953236) psnr: 27.389290 (inf) bit_acc_avg: 0.494792 (0.504244) word_acc_avg: 0.000000 (0.012346) lr: 0.000146 (0.000303) time: 0.604435 data: 0.000135 max mem: 11744
{"iteration": 90, "loss": 1.2392908334732056, "loss_w": 0.7365261316299438, "loss_i": 2.5138235092163086, "psnr": 26.852872848510742, "bit_acc_avg": 0.5104166865348816, "word_acc_avg": 0.0, "lr": 1.9992056638433958e-05}
Train [ 90/100] eta: 0:00:06 iteration: 80.000000 (45.000000) loss: 1.238300 (1.408088) loss_w: 0.728658 (0.827334) loss_i: 2.469121 (2.903773) psnr: 27.554543 (inf) bit_acc_avg: 0.505208 (0.504178) word_acc_avg: 0.000000 (0.010989) lr: 0.000067 (0.000274) time: 0.604932 data: 0.000139 max mem: 11744
Train [ 99/100] eta: 0:00:00 iteration: 89.000000 (49.500000) loss: 1.207999 (1.389514) loss_w: 0.728658 (0.818355) loss_i: 2.441096 (2.855794) psnr: 27.922440 (inf) bit_acc_avg: 0.500000 (0.503542) word_acc_avg: 0.000000 (0.010000) lr: 0.000020 (0.000250) time: 0.605199 data: 0.000140 max mem: 11744
Train Total time: 0:01:05 (0.652994 s / it)
Averaged train stats: iteration: 89.000000 (49.500000) loss: 1.207999 (1.389514) loss_w: 0.728658 (0.818355) loss_i: 2.441096 (2.855794) psnr: 27.922440 (inf) bit_acc_avg: 0.500000 (0.503542) word_acc_avg: 0.000000 (0.010000) lr: 0.000020 (0.000250)
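The lr column traces a linear warmup over the first 20 steps (0 -> 5e-4, matching "warmup_steps": 20 and "optimizer": "AdamW,lr=5e-4") followed by cosine decay; the logged values are consistent with a cosine schedule decaying to a 1e-6 floor. A sketch reproducing them (the 1e-6 floor is inferred from the numbers, not read from the config):

import math

def lr_at(step, base_lr=5e-4, min_lr=1e-6, warmup=20, total=100):
    if step < warmup:
        return base_lr * step / warmup         # linear warmup: 0 at step 0, 2.5e-4 at step 10
    t = (step - warmup) / (total - warmup)     # cosine phase in [0, 1]
    return min_lr + (base_lr - min_lr) * 0.5 * (1 + math.cos(math.pi * t))

# lr_at(30) ~ 4.8101e-4, lr_at(60) ~ 2.505e-4, lr_at(90) ~ 1.9992e-5 -- matching the log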
Eval [ 0/63] eta: 0:05:18 iteration: 0.000000 (0.000000) psnr: 28.165474 (28.165474) bit_acc_none: 0.555990 (0.555990) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.553385 (0.553385) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.553385 (0.553385) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.514323 (0.514323) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.442708 (0.442708) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.516927 (0.516927) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.522135 (0.522135) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.567708 (0.567708) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.583333 (0.583333) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.532552 (0.532552) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.518229 (0.518229) word_acc_jpeg_50: 0.000000 (0.000000) time: 5.063253 data: 0.402300 max mem: 11744
Eval [10/63] eta: 0:01:33 iteration: 5.000000 (5.000000) psnr: 27.568378 (27.737719) bit_acc_none: 0.569010 (0.569247) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.553385 (0.557765) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.579427 (0.577415) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.509115 (0.511482) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.457031 (0.464844) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.516927 (0.518466) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.548177 (0.548887) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.562500 (0.559896) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.555990 (0.556818) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.533854 (0.537760) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.515625 (0.517637) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.769615 data: 0.036715 max mem: 11744
Eval [20/63] eta: 0:01:09 iteration: 10.000000 (10.000000) psnr: 27.469236 (27.630002) bit_acc_none: 0.566406 (0.565290) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.549479 (0.556548) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.570312 (0.570561) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.509115 (0.508123) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.455729 (0.459573) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.520833 (0.516803) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.549479 (0.548425) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.562500 (0.562996) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.553385 (0.555308) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.533854 (0.534536) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.515625 (0.515873) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.440911 data: 0.000159 max mem: 11744
Eval [30/63] eta: 0:00:51 iteration: 20.000000 (15.000000) psnr: 27.386938 (27.652731) bit_acc_none: 0.562500 (0.565188) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.554688 (0.555864) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.563802 (0.570733) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.503906 (0.505418) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.454427 (0.459047) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.520833 (0.514911) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.544271 (0.546329) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.565104 (0.563970) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.554688 (0.555864) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.535156 (0.535030) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.511719 (0.514575) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.440843 data: 0.000160 max mem: 11744
Eval [40/63] eta: 0:00:35 iteration: 30.000000 (20.000000) psnr: 27.628809 (27.676456) bit_acc_none: 0.561198 (0.563675) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.555990 (0.556657) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.565104 (0.570027) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.509115 (0.507304) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.455729 (0.458492) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.503906 (0.513338) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.540365 (0.545128) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.562500 (0.563008) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.554688 (0.553862) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.529948 (0.533346) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.511719 (0.514513) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.440379 data: 0.000156 max mem: 11744
Eval [50/63] eta: 0:00:19 iteration: 40.000000 (25.000000) psnr: 27.603333 (27.603180) bit_acc_none: 0.558594 (0.564491) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.552083 (0.556066) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.566406 (0.570823) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.509115 (0.506281) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.458333 (0.460146) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.511719 (0.514093) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.542969 (0.546262) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.562500 (0.562960) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.546875 (0.553181) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.529948 (0.533854) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.514323 (0.514527) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.440733 data: 0.000153 max mem: 11744
Eval [60/63] eta: 0:00:04 iteration: 50.000000 (30.000000) psnr: 27.436214 (27.608356) bit_acc_none: 0.561198 (0.563375) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.557292 (0.556096) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.571615 (0.570419) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.500000 (0.506340) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.464844 (0.460276) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.511719 (0.512893) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.544271 (0.543993) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.561198 (0.562372) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.545573 (0.551763) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.531250 (0.533897) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.514323 (0.515070) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.440887 data: 0.000147 max mem: 11744
Eval [62/63] eta: 0:00:01 iteration: 52.000000 (31.000000) psnr: 27.436214 (27.599785) bit_acc_none: 0.561198 (0.563099) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.554688 (0.556010) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.570312 (0.569341) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.500000 (0.506097) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.464844 (0.459553) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.511719 (0.512173) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.542969 (0.543382) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.561198 (0.562273) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.545573 (0.551877) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.532552 (0.533730) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.513021 (0.514757) word_acc_jpeg_50: 0.000000 (0.000000) time: 1.411499 data: 0.000145 max mem: 11744
Eval Total time: 0:01:33 (1.466479 s / it)
Averaged eval stats: iteration: 52.000000 (31.000000) psnr: 27.436214 (27.599785) bit_acc_none: 0.561198 (0.563099) word_acc_none: 0.000000 (0.000000) bit_acc_crop_01: 0.554688 (0.556010) word_acc_crop_01: 0.000000 (0.000000) bit_acc_crop_05: 0.570312 (0.569341) word_acc_crop_05: 0.000000 (0.000000) bit_acc_rot_25: 0.500000 (0.506097) word_acc_rot_25: 0.000000 (0.000000) bit_acc_rot_90: 0.464844 (0.459553) word_acc_rot_90: 0.000000 (0.000000) bit_acc_resize_03: 0.511719 (0.512173) word_acc_resize_03: 0.000000 (0.000000) bit_acc_resize_07: 0.542969 (0.543382) word_acc_resize_07: 0.000000 (0.000000) bit_acc_brightness_1p5: 0.561198 (0.562273) word_acc_brightness_1p5: 0.000000 (0.000000) bit_acc_brightness_2: 0.545573 (0.551877) word_acc_brightness_2: 0.000000 (0.000000) bit_acc_jpeg_80: 0.532552 (0.533730) word_acc_jpeg_80: 0.000000 (0.000000) bit_acc_jpeg_50: 0.513021 (0.514757) word_acc_jpeg_50: 0.000000 (0.000000)
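Each eval column pairs bit_acc_* (fraction of the 48 bits recovered) with word_acc_* (fraction of images whose full 48-bit word is recovered) under one attack; the numeric suffix is the attack parameter (rot_25/rot_90: rotation in degrees, jpeg_80/jpeg_50: JPEG quality, brightness_1p5/brightness_2: brightness factor, crop and resize with factors 0.1/0.5 and 0.3/0.7). Since 0.5 bit accuracy is chance level for random bits, the ~0.56 clean-image accuracy (bit_acc_none) and uniformly zero word accuracy show this short 100-step run has not embedded the key. A sketch of the two metrics (names illustrative):

import torch

def bit_word_acc(decoded, key):
    # decoded: (B, 48) extractor logits; key: (48,) target bits in {0, 1}
    bits = (decoded > 0).float()                  # threshold logits to bits
    correct = (bits == key).float()               # (B, 48) per-bit matches
    bit_acc = correct.mean()                      # e.g. ~0.5631 for bit_acc_none above
    word_acc = correct.min(dim=1).values.mean()   # an image counts only if all 48 bits match
    return bit_acc, word_acc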