2022/09/15 18:55:02 - mmengine - INFO -
------------------------------------------------------------
System environment:
    sys.platform: linux
    Python: 3.7.13 (default, Mar 29 2022, 02:18:16) [GCC 7.5.0]
    CUDA available: True
    numpy_random_seed: 80577238
    GPU 0,1,2,3,4,5,6,7: NVIDIA A100-SXM4-80GB
    CUDA_HOME: /mnt/cache/share/cuda-11.1
    NVCC: Cuda compilation tools, release 11.1, V11.1.74
    GCC: gcc (GCC) 5.4.0
    PyTorch: 1.9.0+cu111
    PyTorch compiling details: PyTorch built with:
  - GCC 7.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.1.2 (Git Hash 98be7e8afa711dc9b66c8ff3504129cb82013cdb)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.1
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.0.5
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.1, CUDNN_VERSION=8.0.5, CXX_COMPILER=/opt/rh/devtoolset-7/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.9.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON,
    TorchVision: 0.10.0+cu111
    OpenCV: 4.5.4
    MMEngine: 0.1.0

Runtime environment:
    cudnn_benchmark: True
    mp_cfg: {'mp_start_method': 'fork', 'opencv_num_threads': 0}
    dist_cfg: {'backend': 'nccl'}
    seed: None
    Distributed launcher: slurm
    Distributed training: True
    GPU number: 8
------------------------------------------------------------
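The pairing above (PyTorch 1.9.0+cu111 against CUDA 11.1 and cuDNN 8.0.5 on eight A100s) is worth confirming before launching the same job on another node. A minimal check using only standard torch calls, not part of the original run:

# Minimal environment sanity check (illustrative, not part of the logged run).
import torch
print('PyTorch:', torch.__version__)                 # expect 1.9.0+cu111
print('CUDA runtime:', torch.version.cuda)           # expect 11.1
print('cuDNN:', torch.backends.cudnn.version())      # expect 8005, i.e. 8.0.5
print('CUDA available:', torch.cuda.is_available())
if torch.cuda.is_available():
    print('GPUs:', torch.cuda.device_count(), torch.cuda.get_device_name(0))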
2022/09/15 18:55:05 - mmengine - INFO - Config:
mj_rec_data_root = 'data/rec/Syn90k/'
mj_rec_train = dict(type='OCRDataset', data_root='data/rec/Syn90k/', data_prefix=dict(img_path='mnt/ramdisk/max/90kDICT32px'), ann_file='train_labels.json', test_mode=False, pipeline=None)
mj_sub_rec_train = dict(type='OCRDataset', data_root='data/rec/Syn90k/', data_prefix=dict(img_path='mnt/ramdisk/max/90kDICT32px'), ann_file='subset_train_labels.json', test_mode=False, pipeline=None)
st_data_root = 'data/rec/SynthText/'
st_rec_train = dict(type='OCRDataset', data_root='data/rec/SynthText/', data_prefix=dict(img_path='synthtext/SynthText_patch_horizontal'), ann_file='train_labels.json', test_mode=False, pipeline=None)
st_an_rec_train = dict(type='OCRDataset', data_root='data/rec/SynthText/', data_prefix=dict(img_path='synthtext/SynthText_patch_horizontal'), ann_file='alphanumeric_train_labels.json', test_mode=False, pipeline=None)
st_sub_rec_train = dict(type='OCRDataset', data_root='data/rec/SynthText/', data_prefix=dict(img_path='synthtext/SynthText_patch_horizontal'), ann_file='subset_train_labels.json', test_mode=False, pipeline=None)
st_add_rec_data_root = 'data/rec/synthtext_add/'
st_add_rec_train = dict(type='OCRDataset', data_root='data/rec/synthtext_add/', ann_file='train_labels.json', test_mode=False, pipeline=None)
cocov1_rec_train_data_root = 'data/rec/coco_text_v1'
cocov1_rec_train = dict(type='OCRDataset', data_root='data/rec/coco_text_v1', ann_file='train_labels.json', test_mode=False, pipeline=None)
cute80_rec_data_root = 'data/rec/ct80/'
cute80_rec_test = dict(type='OCRDataset', data_root='data/rec/ct80/', ann_file='test_labels.json', test_mode=True, pipeline=None)
iiit5k_rec_data_root = 'data/rec/IIIT5K/'
iiit5k_rec_train = dict(type='OCRDataset', data_root='data/rec/IIIT5K/', ann_file='train_labels.json', test_mode=False, pipeline=None)
iiit5k_rec_test = dict(type='OCRDataset', data_root='data/rec/IIIT5K/', ann_file='test_labels.json', test_mode=True, pipeline=None)
svt_rec_data_root = 'data/rec/svt/'
svt_rec_test = dict(type='OCRDataset', data_root='data/rec/svt/', ann_file='test_labels.json', test_mode=True, pipeline=None)
svtp_rec_data_root = 'data/rec/svtp/'
svtp_rec_test = dict(type='OCRDataset', data_root='data/rec/svtp/', ann_file='test_labels.json', test_mode=True, pipeline=None)
ic11_rec_data_root = 'data/rec/icdar_2011/'
ic11_rec_train = dict(type='OCRDataset', data_root='data/rec/icdar_2011/', ann_file='train_labels.json', test_mode=False, pipeline=None)
ic13_rec_data_root = 'data/rec/icdar_2013/'
ic13_rec_train = dict(type='OCRDataset', data_root='data/rec/icdar_2013/', ann_file='train_labels.json', test_mode=False, pipeline=None)
ic13_rec_test = dict(type='OCRDataset', data_root='data/rec/icdar_2013/', ann_file='test_labels.json', test_mode=True, pipeline=None)
ic15_rec_data_root = 'data/rec/icdar_2015/'
ic15_rec_train = dict(type='OCRDataset', data_root='data/rec/icdar_2015/', ann_file='train_labels.json', test_mode=False, pipeline=None)
ic15_rec_test = dict(type='OCRDataset', data_root='data/rec/icdar_2015/', ann_file='test_labels.json', test_mode=True, pipeline=None)
default_scope = 'mmocr'
env_cfg = dict(
    cudnn_benchmark=True,
    mp_cfg=dict(mp_start_method='fork', opencv_num_threads=0),
    dist_cfg=dict(backend='nccl'))
randomness = dict(seed=None)
default_hooks = dict(
    timer=dict(type='IterTimerHook'),
    logger=dict(type='LoggerHook', interval=100),
    param_scheduler=dict(type='ParamSchedulerHook'),
    checkpoint=dict(type='CheckpointHook', interval=1, out_dir='sproject:s3://1.0.0rc0_recog_retest'),
    sampler_seed=dict(type='DistSamplerSeedHook'),
    sync_buffer=dict(type='SyncBuffersHook'),
    visualization=dict(type='VisualizationHook', interval=1, enable=False, show=False, draw_gt=False, draw_pred=False))
log_level = 'INFO'
log_processor = dict(type='LogProcessor', window_size=10, by_epoch=True)
load_from = None
resume = False
val_evaluator = dict(
    type='MultiDatasetsEvaluator',
    metrics=[dict(type='WordMetric', mode=['ignore_case_symbol'])],
    dataset_prefixes=['CUTE80', 'IIIT5K', 'SVT', 'SVTP', 'IC13', 'IC15'])
test_evaluator = dict(
    type='MultiDatasetsEvaluator',
    metrics=[dict(type='WordMetric', mode=['ignore_case_symbol'])],
    dataset_prefixes=['CUTE80', 'IIIT5K', 'SVT', 'SVTP', 'IC13', 'IC15'])
vis_backends = [dict(type='LocalVisBackend')]
visualizer = dict(type='TextRecogLocalVisualizer', name='visualizer', vis_backends=[dict(type='LocalVisBackend')])
optim_wrapper = dict(type='OptimWrapper', optimizer=dict(type='Adam', lr=0.001))
train_cfg = dict(type='EpochBasedTrainLoop', max_epochs=5, val_interval=1)
val_cfg = dict(type='ValLoop')
test_cfg = dict(type='TestLoop')
param_scheduler = [dict(type='MultiStepLR', milestones=[3, 4], end=5)]
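Together, optim_wrapper, train_cfg and param_scheduler fix the learning-rate column that appears later in the log: Adam starts at 1e-3 and MultiStepLR drops it by a factor of 10 after epoch 3 and again after epoch 4. A minimal sketch of the resulting per-epoch values, assuming MMEngine's default gamma=0.1 and by_epoch=True (neither is spelled out in the dumped config):

# Sketch of the epoch-wise learning rate implied by param_scheduler above
# (gamma=0.1 and by_epoch=True are assumed defaults, not shown in the config).
base_lr = 1e-3           # optimizer=dict(type='Adam', lr=0.001)
milestones = [3, 4]      # decay after epoch 3 and after epoch 4
gamma = 0.1
for epoch in range(1, 6):                        # max_epochs=5
    decays = sum(epoch > m for m in milestones)
    print(epoch, base_lr * gamma ** decays)      # 1e-03, 1e-03, 1e-03, 1e-04, 1e-05

This matches the logged values: epochs 1-3 train at lr 1.0000e-03, epoch 4 at 1.0000e-04 and epoch 5 at 1.0000e-05.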
dictionary = dict(
    type='Dictionary',
    dict_file='configs/textrecog/sar/../../../dicts/english_digits_symbols.txt',
    with_start=True,
    with_end=True,
    same_start_end=True,
    with_padding=True,
    with_unknown=True)
model = dict(
    type='SARNet',
    data_preprocessor=dict(type='TextRecogDataPreprocessor', mean=[127, 127, 127], std=[127, 127, 127]),
    backbone=dict(type='ResNet31OCR'),
    encoder=dict(type='SAREncoder', enc_bi_rnn=False, enc_do_rnn=0.1, enc_gru=False),
    decoder=dict(
        type='SequentialSARDecoder',
        enc_bi_rnn=False,
        dec_bi_rnn=False,
        dec_do_rnn=0,
        dec_gru=False,
        pred_dropout=0.1,
        d_k=512,
        pred_concat=True,
        postprocessor=dict(type='AttentionPostprocessor'),
        module_loss=dict(type='CEModuleLoss', ignore_first_char=True, reduction='mean'),
        dictionary=dict(
            type='Dictionary',
            dict_file='configs/textrecog/sar/../../../dicts/english_digits_symbols.txt',
            with_start=True,
            with_end=True,
            same_start_end=True,
            with_padding=True,
            with_unknown=True),
        max_seq_len=30))
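The `init_weights` dump further down reports decoder.embedding.weight as torch.Size([93, 512]) and decoder.prediction.weight as torch.Size([93, 1536]). The 93 is consistent with a 90-character english_digits_symbols.txt plus the three special tokens this Dictionary adds (one shared start/end token because same_start_end=True, one padding token, one unknown token), and the 1536-dimensional prediction input is consistent with pred_concat=True concatenating three 512-d features. A rough sketch of the class-count arithmetic; the 90-character figure is an assumption about the dict file, not something the log states:

# Rough sketch: how the Dictionary flags above turn a character list into 93 classes.
def num_classes(num_chars, with_start=True, with_end=True, same_start_end=True,
                with_padding=True, with_unknown=True):
    n = num_chars
    if with_start and with_end and same_start_end:
        n += 1                                    # shared <BOS/EOS>
    else:
        n += int(with_start) + int(with_end)
    if with_padding:
        n += 1                                    # <PAD>
    if with_unknown:
        n += 1                                    # <UKN>
    return n

print(num_classes(90))   # -> 93, matching the [93, ...] decoder shapes below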
file_client_args = dict(backend='disk')
train_pipeline = [
    dict(type='LoadImageFromFile', file_client_args=dict(backend='disk'), ignore_empty=True, min_size=2),
    dict(type='LoadOCRAnnotations', with_text=True),
    dict(type='RescaleToHeight', height=48, min_width=48, max_width=160, width_divisor=4),
    dict(type='PadToWidth', width=160),
    dict(type='PackTextRecogInputs', meta_keys=('img_path', 'ori_shape', 'img_shape', 'valid_ratio'))
]
test_pipeline = [
    dict(type='LoadImageFromFile', file_client_args=dict(backend='disk')),
    dict(type='RescaleToHeight', height=48, min_width=48, max_width=160, width_divisor=4),
    dict(type='PadToWidth', width=160),
    dict(type='LoadOCRAnnotations', with_text=True),
    dict(type='PackTextRecogInputs', meta_keys=('img_path', 'ori_shape', 'img_shape', 'valid_ratio'))
]
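Both pipelines normalise geometry the same way: RescaleToHeight resizes each crop to a height of 48 px while keeping the aspect ratio, clamps the width to [48, 160] and rounds it to a multiple of 4; PadToWidth then pads every image to a fixed 160 px so they can be batched, and the resulting valid_ratio is packed into the meta keys so the recogniser can ignore the padded region. A rough sketch of that width arithmetic, written as an assumed mirror of the two transforms rather than a copy of the MMOCR source:

# Assumed width arithmetic for RescaleToHeight(height=48, min_width=48,
# max_width=160, width_divisor=4) followed by PadToWidth(width=160).
def target_geometry(orig_h, orig_w, height=48, min_width=48, max_width=160,
                    width_divisor=4, pad_to=160):
    new_w = round(orig_w * height / orig_h)                  # keep aspect ratio
    new_w = max(min_width, min(max_width, new_w))            # clamp to [48, 160]
    new_w = max(width_divisor, new_w // width_divisor * width_divisor)
    valid_ratio = min(1.0, new_w / pad_to)                   # non-padded fraction
    return height, new_w, pad_to, valid_ratio

print(target_geometry(32, 100))   # a 32x100 crop -> (48, 148, 160, 0.925)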
train_list = [
    dict(
        type='RepeatDataset',
        dataset=dict(
            type='ConcatDataset',
            datasets=[
                dict(type='OCRDataset', data_root='data/rec/icdar_2011/', ann_file='train_labels.json', test_mode=False, pipeline=None),
                dict(type='OCRDataset', data_root='data/rec/icdar_2013/', ann_file='train_labels.json', test_mode=False, pipeline=None),
                dict(type='OCRDataset', data_root='data/rec/icdar_2015/', ann_file='train_labels.json', test_mode=False, pipeline=None),
                dict(type='OCRDataset', data_root='data/rec/coco_text_v1', ann_file='train_labels.json', test_mode=False, pipeline=None),
                dict(type='OCRDataset', data_root='data/rec/IIIT5K/', ann_file='train_labels.json', test_mode=False, pipeline=None)
            ],
            pipeline=[
                dict(type='LoadImageFromFile', file_client_args=dict(backend='disk'), ignore_empty=True, min_size=2),
                dict(type='LoadOCRAnnotations', with_text=True),
                dict(type='RescaleToHeight', height=48, min_width=48, max_width=160, width_divisor=4),
                dict(type='PadToWidth', width=160),
                dict(type='PackTextRecogInputs', meta_keys=('img_path', 'ori_shape', 'img_shape', 'valid_ratio'))
            ]),
        times=20),
    dict(
        type='ConcatDataset',
        datasets=[
            dict(type='OCRDataset', data_root='data/rec/Syn90k/', data_prefix=dict(img_path='mnt/ramdisk/max/90kDICT32px'), ann_file='subset_train_labels.json', test_mode=False, pipeline=None),
            dict(type='OCRDataset', data_root='data/rec/SynthText/', data_prefix=dict(img_path='synthtext/SynthText_patch_horizontal'), ann_file='subset_train_labels.json', test_mode=False, pipeline=None),
            dict(type='OCRDataset', data_root='data/rec/synthtext_add/', ann_file='train_labels.json', test_mode=False, pipeline=None)
        ],
        pipeline=[
            dict(type='LoadImageFromFile', file_client_args=dict(backend='disk'), ignore_empty=True, min_size=2),
            dict(type='LoadOCRAnnotations', with_text=True),
            dict(type='RescaleToHeight', height=48, min_width=48, max_width=160, width_divisor=4),
            dict(type='PadToWidth', width=160),
            dict(type='PackTextRecogInputs', meta_keys=('img_path', 'ori_shape', 'img_shape', 'valid_ratio'))
        ])
]
test_list = [
    dict(type='OCRDataset', data_root='data/rec/ct80/', ann_file='test_labels.json', test_mode=True, pipeline=None),
    dict(type='OCRDataset', data_root='data/rec/IIIT5K/', ann_file='test_labels.json', test_mode=True, pipeline=None),
    dict(type='OCRDataset', data_root='data/rec/svt/', ann_file='test_labels.json', test_mode=True, pipeline=None),
    dict(type='OCRDataset', data_root='data/rec/svtp/', ann_file='test_labels.json', test_mode=True, pipeline=None),
    dict(type='OCRDataset', data_root='data/rec/icdar_2013/', ann_file='test_labels.json', test_mode=True, pipeline=None),
    dict(type='OCRDataset', data_root='data/rec/icdar_2015/', ann_file='test_labels.json', test_mode=True, pipeline=None)
]
train_dataloader = dict(
    batch_size=384,
    num_workers=24,
    persistent_workers=True,
    sampler=dict(type='DefaultSampler', shuffle=True),
    dataset=dict(
        type='ConcatDataset',
        datasets=[
            dict(
                type='RepeatDataset',
                dataset=dict(
                    type='ConcatDataset',
                    datasets=[
                        dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/icdar_2011', ann_file='train_labels.json', test_mode=False, pipeline=None),
                        dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/icdar_2013', ann_file='train_labels.json', test_mode=False, pipeline=None),
                        dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/icdar_2015', ann_file='train_labels.json', test_mode=False, pipeline=None),
                        dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/coco_text_v1', ann_file='train_labels.json', test_mode=False, pipeline=None),
                        dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/IIIT5K', ann_file='train_labels.json', test_mode=False, pipeline=None)
                    ],
                    pipeline=[
                        dict(type='LoadImageFromFile', file_client_args=dict(backend='petrel'), ignore_empty=True, min_size=2),
                        dict(type='LoadOCRAnnotations', with_text=True),
                        dict(type='RescaleToHeight', height=48, min_width=48, max_width=160, width_divisor=4),
                        dict(type='PadToWidth', width=160),
                        dict(type='PackTextRecogInputs', meta_keys=('img_path', 'ori_shape', 'img_shape', 'valid_ratio'))
                    ]),
                times=20),
            dict(
                type='ConcatDataset',
                datasets=[
                    dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/Syn90k', data_prefix=dict(img_path='mnt/ramdisk/max/90kDICT32px'), ann_file='subset_train_labels.json', test_mode=False, pipeline=None),
                    dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/SynthText', data_prefix=dict(img_path='synthtext/SynthText_patch_horizontal'), ann_file='subset_train_labels.json', test_mode=False, pipeline=None),
                    dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/synthtext_add', ann_file='train_labels.json', test_mode=False, pipeline=None)
                ],
                pipeline=[
                    dict(type='LoadImageFromFile', file_client_args=dict(backend='petrel'), ignore_empty=True, min_size=2),
                    dict(type='LoadOCRAnnotations', with_text=True),
                    dict(type='RescaleToHeight', height=48, min_width=48, max_width=160, width_divisor=4),
                    dict(type='PadToWidth', width=160),
                    dict(type='PackTextRecogInputs', meta_keys=('img_path', 'ori_shape', 'img_shape', 'valid_ratio'))
                ])
        ],
        verify_meta=False))
test_dataloader = dict(
    batch_size=1,
    num_workers=4,
    persistent_workers=True,
    drop_last=False,
    sampler=dict(type='DefaultSampler', shuffle=False),
    dataset=dict(
        type='ConcatDataset',
        datasets=[
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/ct80', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/IIIT5K', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/svt', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/svtp', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/icdar_2013', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/icdar_2015', ann_file='test_labels.json', test_mode=True, pipeline=None)
        ],
        pipeline=[
            dict(type='LoadImageFromFile', file_client_args=dict(backend='petrel')),
            dict(type='RescaleToHeight', height=48, min_width=48, max_width=160, width_divisor=4),
            dict(type='PadToWidth', width=160),
            dict(type='LoadOCRAnnotations', with_text=True),
            dict(type='PackTextRecogInputs', meta_keys=('img_path', 'ori_shape', 'img_shape', 'valid_ratio'))
        ]))
val_dataloader = dict(
    batch_size=1,
    num_workers=4,
    persistent_workers=True,
    drop_last=False,
    sampler=dict(type='DefaultSampler', shuffle=False),
    dataset=dict(
        type='ConcatDataset',
        datasets=[
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/ct80', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/IIIT5K', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/svt', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/svtp', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/icdar_2013', ann_file='test_labels.json', test_mode=True, pipeline=None),
            dict(type='OCRDataset', data_root='openmmlab:s3://openmmlab/datasets/ocr/recog/icdar_2015', ann_file='test_labels.json', test_mode=True, pipeline=None)
        ],
        pipeline=[
            dict(type='LoadImageFromFile', file_client_args=dict(backend='petrel')),
            dict(type='RescaleToHeight', height=48, min_width=48, max_width=160, width_divisor=4),
            dict(type='PadToWidth', width=160),
            dict(type='LoadOCRAnnotations', with_text=True),
            dict(type='PackTextRecogInputs', meta_keys=('img_path', 'ori_shape', 'img_shape', 'valid_ratio'))
        ]))
launcher = 'slurm'
work_dir = './work_dirs/sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real'

Name of parameter - Initialization information
backbone.conv1_1.weight - torch.Size([64, 3, 3, 3]):
KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0
backbone.conv1_1.bias - torch.Size([64]):
KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.bn1_1.weight - torch.Size([64]): UniformInit: a=0, b=1, bias=0 backbone.bn1_1.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of SARNet backbone.conv1_2.weight - torch.Size([128, 64, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.conv1_2.bias - torch.Size([128]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.bn1_2.weight - torch.Size([128]): UniformInit: a=0, b=1, bias=0 backbone.bn1_2.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of SARNet backbone.block2.0.conv1.weight - torch.Size([256, 128, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block2.0.conv2.weight - torch.Size([256, 256, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block2.0.bn1.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.block2.0.bn1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SARNet backbone.block2.0.bn2.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.block2.0.bn2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SARNet backbone.block2.0.downsample.0.weight - torch.Size([256, 128, 1, 1]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block2.0.downsample.1.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.block2.0.downsample.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SARNet backbone.conv2.weight - torch.Size([256, 256, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.conv2.bias - torch.Size([256]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.bn2.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.bn2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SARNet backbone.block3.0.conv1.weight - torch.Size([256, 256, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block3.0.conv2.weight - torch.Size([256, 256, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block3.0.bn1.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.block3.0.bn1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SARNet backbone.block3.0.bn2.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.block3.0.bn2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SARNet backbone.block3.1.conv1.weight - torch.Size([256, 256, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block3.1.conv2.weight - torch.Size([256, 256, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block3.1.bn1.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.block3.1.bn1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SARNet backbone.block3.1.bn2.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.block3.1.bn2.bias - torch.Size([256]): The value is the same before and after calling 
`init_weights` of SARNet backbone.conv3.weight - torch.Size([256, 256, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.conv3.bias - torch.Size([256]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.bn3.weight - torch.Size([256]): UniformInit: a=0, b=1, bias=0 backbone.bn3.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.0.conv1.weight - torch.Size([512, 256, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.0.conv2.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.0.bn1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.0.bn1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.0.bn2.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.0.bn2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.0.downsample.0.weight - torch.Size([512, 256, 1, 1]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.0.downsample.1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.0.downsample.1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.1.conv1.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.1.conv2.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.1.bn1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.1.bn1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.1.bn2.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.1.bn2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.2.conv1.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.2.conv2.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.2.bn1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.2.bn1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.2.bn2.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.2.bn2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.3.conv1.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.3.conv2.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.3.bn1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.3.bn1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.3.bn2.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.3.bn2.bias - torch.Size([512]): The value is the same before and after calling 
`init_weights` of SARNet backbone.block4.4.conv1.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.4.conv2.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block4.4.bn1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.4.bn1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block4.4.bn2.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block4.4.bn2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.conv4.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.conv4.bias - torch.Size([512]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.bn4.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.bn4.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block5.0.conv1.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block5.0.conv2.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block5.0.bn1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block5.0.bn1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block5.0.bn2.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block5.0.bn2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block5.1.conv1.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block5.1.conv2.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block5.1.bn1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block5.1.bn1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block5.1.bn2.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block5.1.bn2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block5.2.conv1.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block5.2.conv2.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.block5.2.bn1.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block5.2.bn1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.block5.2.bn2.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.block5.2.bn2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet backbone.conv5.weight - torch.Size([512, 512, 3, 3]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.conv5.bias - torch.Size([512]): KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution =normal, bias=0 backbone.bn5.weight - torch.Size([512]): UniformInit: a=0, b=1, bias=0 backbone.bn5.bias - 
torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet encoder.rnn_encoder.weight_ih_l0 - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of SARNet encoder.rnn_encoder.weight_hh_l0 - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of SARNet encoder.rnn_encoder.bias_ih_l0 - torch.Size([2048]): The value is the same before and after calling `init_weights` of SARNet encoder.rnn_encoder.bias_hh_l0 - torch.Size([2048]): The value is the same before and after calling `init_weights` of SARNet encoder.rnn_encoder.weight_ih_l1 - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of SARNet encoder.rnn_encoder.weight_hh_l1 - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of SARNet encoder.rnn_encoder.bias_ih_l1 - torch.Size([2048]): The value is the same before and after calling `init_weights` of SARNet encoder.rnn_encoder.bias_hh_l1 - torch.Size([2048]): The value is the same before and after calling `init_weights` of SARNet encoder.linear.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of SARNet encoder.linear.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet decoder.conv1x1_1.weight - torch.Size([512, 512, 1, 1]): The value is the same before and after calling `init_weights` of SARNet decoder.conv1x1_1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet decoder.conv3x3_1.weight - torch.Size([512, 512, 3, 3]): The value is the same before and after calling `init_weights` of SARNet decoder.conv3x3_1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SARNet decoder.conv1x1_2.weight - torch.Size([1, 512, 1, 1]): The value is the same before and after calling `init_weights` of SARNet decoder.conv1x1_2.bias - torch.Size([1]): The value is the same before and after calling `init_weights` of SARNet decoder.rnn_decoder_layer1.weight_ih - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of SARNet decoder.rnn_decoder_layer1.weight_hh - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of SARNet decoder.rnn_decoder_layer1.bias_ih - torch.Size([2048]): The value is the same before and after calling `init_weights` of SARNet decoder.rnn_decoder_layer1.bias_hh - torch.Size([2048]): The value is the same before and after calling `init_weights` of SARNet decoder.rnn_decoder_layer2.weight_ih - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of SARNet decoder.rnn_decoder_layer2.weight_hh - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of SARNet decoder.rnn_decoder_layer2.bias_ih - torch.Size([2048]): The value is the same before and after calling `init_weights` of SARNet decoder.rnn_decoder_layer2.bias_hh - torch.Size([2048]): The value is the same before and after calling `init_weights` of SARNet decoder.embedding.weight - torch.Size([93, 512]): The value is the same before and after calling `init_weights` of SARNet decoder.prediction.weight - torch.Size([93, 1536]): The value is the same before and after calling `init_weights` of SARNet decoder.prediction.bias - torch.Size([93]): The value is the same before and after calling `init_weights` of SARNet 2022/09/15 18:57:40 - mmengine - INFO - 
Checkpoints will be saved to sproject:s3://1.0.0rc0_recog_retest/sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real by PetrelBackend. 2022/09/15 19:23:28 - mmengine - INFO - Epoch(train) [1][100/2304] lr: 1.0000e-03 eta: 2 days, 1:06:26 time: 1.8942 data_time: 0.2622 memory: 42889 loss_ce: 2.5365 loss: 2.5365 2022/09/15 19:26:26 - mmengine - INFO - Epoch(train) [1][200/2304] lr: 1.0000e-03 eta: 1 day, 3:07:36 time: 2.1585 data_time: 0.2534 memory: 28222 loss_ce: 2.1335 loss: 2.1335 2022/09/15 19:29:21 - mmengine - INFO - Epoch(train) [1][300/2304] lr: 1.0000e-03 eta: 19:44:38 time: 2.1622 data_time: 0.0641 memory: 28222 loss_ce: 1.3505 loss: 1.3505 2022/09/15 19:32:10 - mmengine - INFO - Epoch(train) [1][400/2304] lr: 1.0000e-03 eta: 15:58:55 time: 1.4262 data_time: 0.0683 memory: 28222 loss_ce: 0.9147 loss: 0.9147 2022/09/15 19:35:04 - mmengine - INFO - Epoch(train) [1][500/2304] lr: 1.0000e-03 eta: 13:44:10 time: 1.3790 data_time: 0.0446 memory: 28222 loss_ce: 0.7671 loss: 0.7671 2022/09/15 19:37:58 - mmengine - INFO - Epoch(train) [1][600/2304] lr: 1.0000e-03 eta: 12:13:24 time: 1.3041 data_time: 0.0066 memory: 28222 loss_ce: 0.6736 loss: 0.6736 2022/09/15 19:40:57 - mmengine - INFO - Epoch(train) [1][700/2304] lr: 1.0000e-03 eta: 11:09:00 time: 1.9162 data_time: 0.2963 memory: 28222 loss_ce: 0.6446 loss: 0.6446 2022/09/15 19:43:55 - mmengine - INFO - Epoch(train) [1][800/2304] lr: 1.0000e-03 eta: 10:19:35 time: 2.2317 data_time: 0.2736 memory: 28222 loss_ce: 0.6031 loss: 0.6031 2022/09/15 19:46:50 - mmengine - INFO - Epoch(train) [1][900/2304] lr: 1.0000e-03 eta: 9:40:03 time: 2.1574 data_time: 0.0657 memory: 28222 loss_ce: 0.5774 loss: 0.5774 2022/09/15 19:49:39 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 19:49:39 - mmengine - INFO - Epoch(train) [1][1000/2304] lr: 1.0000e-03 eta: 9:06:45 time: 1.4156 data_time: 0.0625 memory: 28222 loss_ce: 0.5676 loss: 0.5676 2022/09/15 19:52:35 - mmengine - INFO - Epoch(train) [1][1100/2304] lr: 1.0000e-03 eta: 8:40:10 time: 1.3637 data_time: 0.0448 memory: 28222 loss_ce: 0.5458 loss: 0.5458 2022/09/15 19:55:30 - mmengine - INFO - Epoch(train) [1][1200/2304] lr: 1.0000e-03 eta: 8:17:14 time: 1.3495 data_time: 0.0069 memory: 28222 loss_ce: 0.5370 loss: 0.5370 2022/09/15 19:58:28 - mmengine - INFO - Epoch(train) [1][1300/2304] lr: 1.0000e-03 eta: 7:57:56 time: 1.9429 data_time: 0.2836 memory: 28222 loss_ce: 0.5291 loss: 0.5291 2022/09/15 20:01:25 - mmengine - INFO - Epoch(train) [1][1400/2304] lr: 1.0000e-03 eta: 7:40:44 time: 2.1037 data_time: 0.2604 memory: 28222 loss_ce: 0.5151 loss: 0.5151 2022/09/15 20:04:19 - mmengine - INFO - Epoch(train) [1][1500/2304] lr: 1.0000e-03 eta: 7:25:09 time: 2.0674 data_time: 0.0828 memory: 28222 loss_ce: 0.5000 loss: 0.5000 2022/09/15 20:07:09 - mmengine - INFO - Epoch(train) [1][1600/2304] lr: 1.0000e-03 eta: 7:10:44 time: 1.4402 data_time: 0.0433 memory: 28222 loss_ce: 0.4860 loss: 0.4860 2022/09/15 20:10:02 - mmengine - INFO - Epoch(train) [1][1700/2304] lr: 1.0000e-03 eta: 6:58:01 time: 1.4054 data_time: 0.0447 memory: 28222 loss_ce: 0.5103 loss: 0.5103 2022/09/15 20:12:56 - mmengine - INFO - Epoch(train) [1][1800/2304] lr: 1.0000e-03 eta: 6:46:24 time: 1.3537 data_time: 0.0081 memory: 28222 loss_ce: 0.4862 loss: 0.4862 2022/09/15 20:15:55 - mmengine - INFO - Epoch(train) [1][1900/2304] lr: 1.0000e-03 eta: 6:36:10 time: 1.9138 data_time: 0.2494 memory: 28222 loss_ce: 0.4626 loss: 0.4626 2022/09/15 20:18:51 - mmengine - INFO - Exp name: 
sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 20:18:51 - mmengine - INFO - Epoch(train) [1][2000/2304] lr: 1.0000e-03 eta: 6:26:24 time: 2.1842 data_time: 0.2693 memory: 28222 loss_ce: 0.4527 loss: 0.4527 2022/09/15 20:21:44 - mmengine - INFO - Epoch(train) [1][2100/2304] lr: 1.0000e-03 eta: 6:17:05 time: 1.9455 data_time: 0.0448 memory: 28222 loss_ce: 0.4625 loss: 0.4625 2022/09/15 20:24:34 - mmengine - INFO - Epoch(train) [1][2200/2304] lr: 1.0000e-03 eta: 6:08:06 time: 1.4178 data_time: 0.0449 memory: 28222 loss_ce: 0.4392 loss: 0.4392 2022/09/15 20:27:22 - mmengine - INFO - Epoch(train) [1][2300/2304] lr: 1.0000e-03 eta: 5:59:34 time: 1.3271 data_time: 0.0332 memory: 28222 loss_ce: 0.4405 loss: 0.4405 2022/09/15 20:27:39 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 20:27:39 - mmengine - INFO - Saving checkpoint at 1 epochs 2022/09/15 20:32:45 - mmengine - INFO - Epoch(val) [1][100/959] eta: 0:00:42 time: 0.0500 data_time: 0.0015 memory: 37551 2022/09/15 20:32:50 - mmengine - INFO - Epoch(val) [1][200/959] eta: 0:00:36 time: 0.0475 data_time: 0.0014 memory: 1132 2022/09/15 20:32:55 - mmengine - INFO - Epoch(val) [1][300/959] eta: 0:00:33 time: 0.0515 data_time: 0.0019 memory: 1132 2022/09/15 20:33:00 - mmengine - INFO - Epoch(val) [1][400/959] eta: 0:00:27 time: 0.0484 data_time: 0.0010 memory: 1132 2022/09/15 20:33:06 - mmengine - INFO - Epoch(val) [1][500/959] eta: 0:00:21 time: 0.0464 data_time: 0.0012 memory: 1132 2022/09/15 20:33:10 - mmengine - INFO - Epoch(val) [1][600/959] eta: 0:00:17 time: 0.0485 data_time: 0.0016 memory: 1132 2022/09/15 20:33:16 - mmengine - INFO - Epoch(val) [1][700/959] eta: 0:00:11 time: 0.0441 data_time: 0.0017 memory: 1132 2022/09/15 20:33:21 - mmengine - INFO - Epoch(val) [1][800/959] eta: 0:00:08 time: 0.0538 data_time: 0.0016 memory: 1132 2022/09/15 20:33:26 - mmengine - INFO - Epoch(val) [1][900/959] eta: 0:00:02 time: 0.0446 data_time: 0.0022 memory: 1132 2022/09/15 20:33:30 - mmengine - INFO - Epoch(val) [1][959/959] CUTE80/recog/word_acc_ignore_case_symbol: 0.8299 IIIT5K/recog/word_acc_ignore_case_symbol: 0.9250 SVT/recog/word_acc_ignore_case_symbol: 0.8377 SVTP/recog/word_acc_ignore_case_symbol: 0.7550 IC13/recog/word_acc_ignore_case_symbol: 0.8966 IC15/recog/word_acc_ignore_case_symbol: 0.7208 2022/09/15 20:36:28 - mmengine - INFO - Epoch(train) [2][100/2304] lr: 1.0000e-03 eta: 5:51:23 time: 1.9399 data_time: 0.2434 memory: 28222 loss_ce: 0.4315 loss: 0.4315 2022/09/15 20:39:14 - mmengine - INFO - Epoch(train) [2][200/2304] lr: 1.0000e-03 eta: 5:43:38 time: 1.8847 data_time: 0.2407 memory: 28222 loss_ce: 0.4434 loss: 0.4434 2022/09/15 20:42:03 - mmengine - INFO - Epoch(train) [2][300/2304] lr: 1.0000e-03 eta: 5:36:23 time: 1.7150 data_time: 0.1189 memory: 28222 loss_ce: 0.4104 loss: 0.4104 2022/09/15 20:44:51 - mmengine - INFO - Epoch(train) [2][400/2304] lr: 1.0000e-03 eta: 5:29:25 time: 1.5468 data_time: 0.0805 memory: 28222 loss_ce: 0.4153 loss: 0.4153 2022/09/15 20:47:37 - mmengine - INFO - Epoch(train) [2][500/2304] lr: 1.0000e-03 eta: 5:22:41 time: 1.4222 data_time: 0.0063 memory: 28222 loss_ce: 0.4372 loss: 0.4372 2022/09/15 20:50:25 - mmengine - INFO - Epoch(train) [2][600/2304] lr: 1.0000e-03 eta: 5:16:18 time: 1.4379 data_time: 0.0188 memory: 28222 loss_ce: 0.4076 loss: 0.4076 2022/09/15 20:53:07 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 20:53:18 - mmengine - 
INFO - Epoch(train) [2][700/2304] lr: 1.0000e-03 eta: 5:10:26 time: 1.9631 data_time: 0.2178 memory: 28222 loss_ce: 0.4143 loss: 0.4143 2022/09/15 20:56:07 - mmengine - INFO - Epoch(train) [2][800/2304] lr: 1.0000e-03 eta: 5:04:32 time: 1.8819 data_time: 0.2390 memory: 28222 loss_ce: 0.4052 loss: 0.4052 2022/09/15 20:58:56 - mmengine - INFO - Epoch(train) [2][900/2304] lr: 1.0000e-03 eta: 4:58:50 time: 1.6964 data_time: 0.1077 memory: 28222 loss_ce: 0.4082 loss: 0.4082 2022/09/15 21:01:43 - mmengine - INFO - Epoch(train) [2][1000/2304] lr: 1.0000e-03 eta: 4:53:14 time: 1.4850 data_time: 0.0488 memory: 28222 loss_ce: 0.3999 loss: 0.3999 2022/09/15 21:04:32 - mmengine - INFO - Epoch(train) [2][1100/2304] lr: 1.0000e-03 eta: 4:47:52 time: 1.4466 data_time: 0.0066 memory: 28222 loss_ce: 0.3870 loss: 0.3870 2022/09/15 21:07:20 - mmengine - INFO - Epoch(train) [2][1200/2304] lr: 1.0000e-03 eta: 4:42:35 time: 1.4350 data_time: 0.0200 memory: 28222 loss_ce: 0.3934 loss: 0.3934 2022/09/15 21:10:16 - mmengine - INFO - Epoch(train) [2][1300/2304] lr: 1.0000e-03 eta: 4:37:45 time: 2.0150 data_time: 0.3031 memory: 28222 loss_ce: 0.3869 loss: 0.3869 2022/09/15 21:13:04 - mmengine - INFO - Epoch(train) [2][1400/2304] lr: 1.0000e-03 eta: 4:32:46 time: 1.9320 data_time: 0.2509 memory: 28222 loss_ce: 0.3956 loss: 0.3956 2022/09/15 21:15:53 - mmengine - INFO - Epoch(train) [2][1500/2304] lr: 1.0000e-03 eta: 4:27:54 time: 1.7211 data_time: 0.1047 memory: 28222 loss_ce: 0.3795 loss: 0.3795 2022/09/15 21:18:42 - mmengine - INFO - Epoch(train) [2][1600/2304] lr: 1.0000e-03 eta: 4:23:09 time: 1.5117 data_time: 0.0804 memory: 28222 loss_ce: 0.3793 loss: 0.3793 2022/09/15 21:21:23 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 21:21:29 - mmengine - INFO - Epoch(train) [2][1700/2304] lr: 1.0000e-03 eta: 4:18:27 time: 1.4291 data_time: 0.0134 memory: 28222 loss_ce: 0.3745 loss: 0.3745 2022/09/15 21:24:16 - mmengine - INFO - Epoch(train) [2][1800/2304] lr: 1.0000e-03 eta: 4:13:49 time: 1.4571 data_time: 0.0224 memory: 28222 loss_ce: 0.3863 loss: 0.3863 2022/09/15 21:27:10 - mmengine - INFO - Epoch(train) [2][1900/2304] lr: 1.0000e-03 eta: 4:09:28 time: 1.9539 data_time: 0.2462 memory: 28222 loss_ce: 0.3849 loss: 0.3849 2022/09/15 21:29:56 - mmengine - INFO - Epoch(train) [2][2000/2304] lr: 1.0000e-03 eta: 4:05:00 time: 1.8844 data_time: 0.2415 memory: 28222 loss_ce: 0.3841 loss: 0.3841 2022/09/15 21:32:46 - mmengine - INFO - Epoch(train) [2][2100/2304] lr: 1.0000e-03 eta: 4:00:41 time: 1.7356 data_time: 0.1685 memory: 28222 loss_ce: 0.3802 loss: 0.3802 2022/09/15 21:35:34 - mmengine - INFO - Epoch(train) [2][2200/2304] lr: 1.0000e-03 eta: 3:56:24 time: 1.5091 data_time: 0.0423 memory: 28222 loss_ce: 0.3864 loss: 0.3864 2022/09/15 21:38:17 - mmengine - INFO - Epoch(train) [2][2300/2304] lr: 1.0000e-03 eta: 3:52:03 time: 1.3761 data_time: 0.0088 memory: 28222 loss_ce: 0.3908 loss: 0.3908 2022/09/15 21:38:22 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 21:38:22 - mmengine - INFO - Saving checkpoint at 2 epochs 2022/09/15 21:38:55 - mmengine - INFO - Epoch(val) [2][100/959] eta: 0:00:46 time: 0.0536 data_time: 0.0028 memory: 28222 2022/09/15 21:38:59 - mmengine - INFO - Epoch(val) [2][200/959] eta: 0:00:36 time: 0.0481 data_time: 0.0021 memory: 1132 2022/09/15 21:39:05 - mmengine - INFO - Epoch(val) [2][300/959] eta: 0:00:32 time: 0.0499 data_time: 0.0010 memory: 1132 2022/09/15 21:39:09 
- mmengine - INFO - Epoch(val) [2][400/959] eta: 0:00:28 time: 0.0505 data_time: 0.0012 memory: 1132 2022/09/15 21:39:14 - mmengine - INFO - Epoch(val) [2][500/959] eta: 0:00:24 time: 0.0543 data_time: 0.0018 memory: 1132 2022/09/15 21:39:19 - mmengine - INFO - Epoch(val) [2][600/959] eta: 0:00:19 time: 0.0532 data_time: 0.0019 memory: 1132 2022/09/15 21:39:24 - mmengine - INFO - Epoch(val) [2][700/959] eta: 0:00:12 time: 0.0484 data_time: 0.0013 memory: 1132 2022/09/15 21:39:29 - mmengine - INFO - Epoch(val) [2][800/959] eta: 0:00:07 time: 0.0473 data_time: 0.0016 memory: 1132 2022/09/15 21:39:35 - mmengine - INFO - Epoch(val) [2][900/959] eta: 0:00:03 time: 0.0536 data_time: 0.0019 memory: 1132 2022/09/15 21:39:37 - mmengine - INFO - Epoch(val) [2][959/959] CUTE80/recog/word_acc_ignore_case_symbol: 0.8750 IIIT5K/recog/word_acc_ignore_case_symbol: 0.9363 SVT/recog/word_acc_ignore_case_symbol: 0.8717 SVTP/recog/word_acc_ignore_case_symbol: 0.7953 IC13/recog/word_acc_ignore_case_symbol: 0.9261 IC15/recog/word_acc_ignore_case_symbol: 0.7429 2022/09/15 21:42:35 - mmengine - INFO - Epoch(train) [3][100/2304] lr: 1.0000e-03 eta: 3:47:47 time: 2.0079 data_time: 0.2932 memory: 28222 loss_ce: 0.3660 loss: 0.3660 2022/09/15 21:45:21 - mmengine - INFO - Epoch(train) [3][200/2304] lr: 1.0000e-03 eta: 3:43:39 time: 2.0989 data_time: 0.2850 memory: 28222 loss_ce: 0.3635 loss: 0.3635 2022/09/15 21:48:06 - mmengine - INFO - Epoch(train) [3][300/2304] lr: 1.0000e-03 eta: 3:39:32 time: 1.7631 data_time: 0.2350 memory: 28222 loss_ce: 0.3574 loss: 0.3574 2022/09/15 21:50:42 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 21:50:53 - mmengine - INFO - Epoch(train) [3][400/2304] lr: 1.0000e-03 eta: 3:35:31 time: 1.4126 data_time: 0.0231 memory: 28222 loss_ce: 0.3603 loss: 0.3603 2022/09/15 21:53:41 - mmengine - INFO - Epoch(train) [3][500/2304] lr: 1.0000e-03 eta: 3:31:33 time: 1.3456 data_time: 0.0184 memory: 28222 loss_ce: 0.3553 loss: 0.3553 2022/09/15 21:56:28 - mmengine - INFO - Epoch(train) [3][600/2304] lr: 1.0000e-03 eta: 3:27:39 time: 1.4021 data_time: 0.0064 memory: 28222 loss_ce: 0.3535 loss: 0.3535 2022/09/15 21:59:21 - mmengine - INFO - Epoch(train) [3][700/2304] lr: 1.0000e-03 eta: 3:23:52 time: 1.9332 data_time: 0.2676 memory: 28222 loss_ce: 0.3486 loss: 0.3486 2022/09/15 22:02:10 - mmengine - INFO - Epoch(train) [3][800/2304] lr: 1.0000e-03 eta: 3:20:04 time: 2.1358 data_time: 0.2976 memory: 28222 loss_ce: 0.3653 loss: 0.3653 2022/09/15 22:04:56 - mmengine - INFO - Epoch(train) [3][900/2304] lr: 1.0000e-03 eta: 3:16:15 time: 1.7477 data_time: 0.2276 memory: 28222 loss_ce: 0.3551 loss: 0.3551 2022/09/15 22:07:42 - mmengine - INFO - Epoch(train) [3][1000/2304] lr: 1.0000e-03 eta: 3:12:27 time: 1.3819 data_time: 0.0201 memory: 28222 loss_ce: 0.3471 loss: 0.3471 2022/09/15 22:10:30 - mmengine - INFO - Epoch(train) [3][1100/2304] lr: 1.0000e-03 eta: 3:08:44 time: 1.3674 data_time: 0.0194 memory: 28222 loss_ce: 0.3488 loss: 0.3488 2022/09/15 22:13:16 - mmengine - INFO - Epoch(train) [3][1200/2304] lr: 1.0000e-03 eta: 3:05:01 time: 1.3703 data_time: 0.0063 memory: 28222 loss_ce: 0.3565 loss: 0.3565 2022/09/15 22:16:09 - mmengine - INFO - Epoch(train) [3][1300/2304] lr: 1.0000e-03 eta: 3:01:26 time: 2.0055 data_time: 0.2512 memory: 28222 loss_ce: 0.3498 loss: 0.3498 2022/09/15 22:18:39 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 22:18:57 - mmengine - INFO - Epoch(train) 
[3][1400/2304] lr: 1.0000e-03 eta: 2:57:49 time: 2.0906 data_time: 0.2989 memory: 28222 loss_ce: 0.3492 loss: 0.3492 2022/09/15 22:21:44 - mmengine - INFO - Epoch(train) [3][1500/2304] lr: 1.0000e-03 eta: 2:54:11 time: 1.7268 data_time: 0.1930 memory: 28222 loss_ce: 0.3367 loss: 0.3367 2022/09/15 22:24:31 - mmengine - INFO - Epoch(train) [3][1600/2304] lr: 1.0000e-03 eta: 2:50:36 time: 1.3577 data_time: 0.0193 memory: 28222 loss_ce: 0.3422 loss: 0.3422 2022/09/15 22:27:20 - mmengine - INFO - Epoch(train) [3][1700/2304] lr: 1.0000e-03 eta: 2:47:03 time: 1.3429 data_time: 0.0185 memory: 28222 loss_ce: 0.3568 loss: 0.3568 2022/09/15 22:30:07 - mmengine - INFO - Epoch(train) [3][1800/2304] lr: 1.0000e-03 eta: 2:43:31 time: 1.3820 data_time: 0.0061 memory: 28222 loss_ce: 0.3442 loss: 0.3442 2022/09/15 22:33:00 - mmengine - INFO - Epoch(train) [3][1900/2304] lr: 1.0000e-03 eta: 2:40:05 time: 2.0424 data_time: 0.3049 memory: 28222 loss_ce: 0.3584 loss: 0.3584 2022/09/15 22:35:47 - mmengine - INFO - Epoch(train) [3][2000/2304] lr: 1.0000e-03 eta: 2:36:35 time: 2.1015 data_time: 0.3032 memory: 28222 loss_ce: 0.3524 loss: 0.3524 2022/09/15 22:38:35 - mmengine - INFO - Epoch(train) [3][2100/2304] lr: 1.0000e-03 eta: 2:33:06 time: 1.7963 data_time: 0.2097 memory: 28222 loss_ce: 0.3427 loss: 0.3427 2022/09/15 22:41:21 - mmengine - INFO - Epoch(train) [3][2200/2304] lr: 1.0000e-03 eta: 2:29:38 time: 1.4105 data_time: 0.0198 memory: 28222 loss_ce: 0.3363 loss: 0.3363 2022/09/15 22:44:06 - mmengine - INFO - Epoch(train) [3][2300/2304] lr: 1.0000e-03 eta: 2:26:10 time: 1.3151 data_time: 0.0200 memory: 28222 loss_ce: 0.3481 loss: 0.3481 2022/09/15 22:44:11 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 22:44:11 - mmengine - INFO - Saving checkpoint at 3 epochs 2022/09/15 22:44:45 - mmengine - INFO - Epoch(val) [3][100/959] eta: 0:00:41 time: 0.0484 data_time: 0.0010 memory: 28222 2022/09/15 22:44:50 - mmengine - INFO - Epoch(val) [3][200/959] eta: 0:00:39 time: 0.0522 data_time: 0.0028 memory: 1132 2022/09/15 22:44:55 - mmengine - INFO - Epoch(val) [3][300/959] eta: 0:00:33 time: 0.0503 data_time: 0.0016 memory: 1132 2022/09/15 22:45:00 - mmengine - INFO - Epoch(val) [3][400/959] eta: 0:00:26 time: 0.0466 data_time: 0.0016 memory: 1132 2022/09/15 22:45:06 - mmengine - INFO - Epoch(val) [3][500/959] eta: 0:00:24 time: 0.0531 data_time: 0.0010 memory: 1132 2022/09/15 22:45:11 - mmengine - INFO - Epoch(val) [3][600/959] eta: 0:00:16 time: 0.0461 data_time: 0.0009 memory: 1132 2022/09/15 22:45:16 - mmengine - INFO - Epoch(val) [3][700/959] eta: 0:00:14 time: 0.0558 data_time: 0.0024 memory: 1132 2022/09/15 22:45:21 - mmengine - INFO - Epoch(val) [3][800/959] eta: 0:00:09 time: 0.0577 data_time: 0.0010 memory: 1132 2022/09/15 22:45:25 - mmengine - INFO - Epoch(val) [3][900/959] eta: 0:00:01 time: 0.0246 data_time: 0.0005 memory: 1132 2022/09/15 22:45:27 - mmengine - INFO - Epoch(val) [3][959/959] CUTE80/recog/word_acc_ignore_case_symbol: 0.8715 IIIT5K/recog/word_acc_ignore_case_symbol: 0.9340 SVT/recog/word_acc_ignore_case_symbol: 0.8671 SVTP/recog/word_acc_ignore_case_symbol: 0.7814 IC13/recog/word_acc_ignore_case_symbol: 0.9192 IC15/recog/word_acc_ignore_case_symbol: 0.7448 2022/09/15 22:47:58 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 22:48:19 - mmengine - INFO - Epoch(train) [4][100/2304] lr: 1.0000e-04 eta: 2:22:36 time: 1.8842 data_time: 0.2933 memory: 28222 loss_ce: 
0.3105 loss: 0.3105 2022/09/15 22:51:05 - mmengine - INFO - Epoch(train) [4][200/2304] lr: 1.0000e-04 eta: 2:19:11 time: 2.1119 data_time: 0.3362 memory: 28222 loss_ce: 0.3152 loss: 0.3152 2022/09/15 22:53:45 - mmengine - INFO - Epoch(train) [4][300/2304] lr: 1.0000e-04 eta: 2:15:45 time: 1.7550 data_time: 0.1053 memory: 28222 loss_ce: 0.3020 loss: 0.3020 2022/09/15 22:56:25 - mmengine - INFO - Epoch(train) [4][400/2304] lr: 1.0000e-04 eta: 2:12:19 time: 1.3911 data_time: 0.0287 memory: 28222 loss_ce: 0.3265 loss: 0.3265 2022/09/15 22:59:06 - mmengine - INFO - Epoch(train) [4][500/2304] lr: 1.0000e-04 eta: 2:08:55 time: 1.2912 data_time: 0.0061 memory: 28222 loss_ce: 0.2926 loss: 0.2926 2022/09/15 23:01:48 - mmengine - INFO - Epoch(train) [4][600/2304] lr: 1.0000e-04 eta: 2:05:32 time: 1.2985 data_time: 0.0063 memory: 28222 loss_ce: 0.3105 loss: 0.3105 2022/09/15 23:04:36 - mmengine - INFO - Epoch(train) [4][700/2304] lr: 1.0000e-04 eta: 2:02:14 time: 1.8828 data_time: 0.2669 memory: 28222 loss_ce: 0.3028 loss: 0.3028 2022/09/15 23:07:20 - mmengine - INFO - Epoch(train) [4][800/2304] lr: 1.0000e-04 eta: 1:58:55 time: 2.1089 data_time: 0.3139 memory: 28222 loss_ce: 0.3229 loss: 0.3229 2022/09/15 23:10:03 - mmengine - INFO - Epoch(train) [4][900/2304] lr: 1.0000e-04 eta: 1:55:36 time: 1.8148 data_time: 0.1528 memory: 28222 loss_ce: 0.3051 loss: 0.3051 2022/09/15 23:12:49 - mmengine - INFO - Epoch(train) [4][1000/2304] lr: 1.0000e-04 eta: 1:52:19 time: 1.4117 data_time: 0.0279 memory: 28222 loss_ce: 0.3029 loss: 0.3029 2022/09/15 23:15:16 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 23:15:33 - mmengine - INFO - Epoch(train) [4][1100/2304] lr: 1.0000e-04 eta: 1:49:02 time: 1.3500 data_time: 0.0064 memory: 28222 loss_ce: 0.3018 loss: 0.3018 2022/09/15 23:18:19 - mmengine - INFO - Epoch(train) [4][1200/2304] lr: 1.0000e-04 eta: 1:45:47 time: 1.3280 data_time: 0.0068 memory: 28222 loss_ce: 0.2977 loss: 0.2977 2022/09/15 23:21:08 - mmengine - INFO - Epoch(train) [4][1300/2304] lr: 1.0000e-04 eta: 1:42:34 time: 1.8821 data_time: 0.2514 memory: 28222 loss_ce: 0.3146 loss: 0.3146 2022/09/15 23:23:53 - mmengine - INFO - Epoch(train) [4][1400/2304] lr: 1.0000e-04 eta: 1:39:20 time: 2.1109 data_time: 0.2946 memory: 28222 loss_ce: 0.3080 loss: 0.3080 2022/09/15 23:26:36 - mmengine - INFO - Epoch(train) [4][1500/2304] lr: 1.0000e-04 eta: 1:36:06 time: 1.8465 data_time: 0.1808 memory: 28222 loss_ce: 0.3174 loss: 0.3174 2022/09/15 23:29:20 - mmengine - INFO - Epoch(train) [4][1600/2304] lr: 1.0000e-04 eta: 1:32:53 time: 1.3956 data_time: 0.0279 memory: 28222 loss_ce: 0.3003 loss: 0.3003 2022/09/15 23:32:04 - mmengine - INFO - Epoch(train) [4][1700/2304] lr: 1.0000e-04 eta: 1:29:40 time: 1.3266 data_time: 0.0068 memory: 28222 loss_ce: 0.3130 loss: 0.3130 2022/09/15 23:34:47 - mmengine - INFO - Epoch(train) [4][1800/2304] lr: 1.0000e-04 eta: 1:26:28 time: 1.3273 data_time: 0.0069 memory: 28222 loss_ce: 0.2984 loss: 0.2984 2022/09/15 23:37:36 - mmengine - INFO - Epoch(train) [4][1900/2304] lr: 1.0000e-04 eta: 1:23:18 time: 1.9270 data_time: 0.2518 memory: 28222 loss_ce: 0.3201 loss: 0.3201 2022/09/15 23:40:20 - mmengine - INFO - Epoch(train) [4][2000/2304] lr: 1.0000e-04 eta: 1:20:08 time: 2.1033 data_time: 0.2870 memory: 28222 loss_ce: 0.3106 loss: 0.3106 2022/09/15 23:42:40 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 23:43:04 - mmengine - INFO - Epoch(train) 
[4][2100/2304] lr: 1.0000e-04 eta: 1:16:58 time: 1.8593 data_time: 0.1350 memory: 28222 loss_ce: 0.3135 loss: 0.3135 2022/09/15 23:45:47 - mmengine - INFO - Epoch(train) [4][2200/2304] lr: 1.0000e-04 eta: 1:13:48 time: 1.3675 data_time: 0.0298 memory: 28222 loss_ce: 0.3019 loss: 0.3019 2022/09/15 23:48:26 - mmengine - INFO - Epoch(train) [4][2300/2304] lr: 1.0000e-04 eta: 1:10:38 time: 1.2948 data_time: 0.0063 memory: 28222 loss_ce: 0.3130 loss: 0.3130 2022/09/15 23:48:31 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/15 23:48:31 - mmengine - INFO - Saving checkpoint at 4 epochs 2022/09/15 23:49:04 - mmengine - INFO - Epoch(val) [4][100/959] eta: 0:00:39 time: 0.0461 data_time: 0.0016 memory: 28222 2022/09/15 23:49:09 - mmengine - INFO - Epoch(val) [4][200/959] eta: 0:00:40 time: 0.0532 data_time: 0.0021 memory: 1132 2022/09/15 23:49:15 - mmengine - INFO - Epoch(val) [4][300/959] eta: 0:00:38 time: 0.0580 data_time: 0.0023 memory: 1132 2022/09/15 23:49:20 - mmengine - INFO - Epoch(val) [4][400/959] eta: 0:00:28 time: 0.0517 data_time: 0.0022 memory: 1132 2022/09/15 23:49:25 - mmengine - INFO - Epoch(val) [4][500/959] eta: 0:00:21 time: 0.0472 data_time: 0.0019 memory: 1132 2022/09/15 23:49:30 - mmengine - INFO - Epoch(val) [4][600/959] eta: 0:00:18 time: 0.0511 data_time: 0.0011 memory: 1132 2022/09/15 23:49:35 - mmengine - INFO - Epoch(val) [4][700/959] eta: 0:00:12 time: 0.0474 data_time: 0.0008 memory: 1132 2022/09/15 23:49:40 - mmengine - INFO - Epoch(val) [4][800/959] eta: 0:00:08 time: 0.0562 data_time: 0.0033 memory: 1132 2022/09/15 23:49:45 - mmengine - INFO - Epoch(val) [4][900/959] eta: 0:00:02 time: 0.0504 data_time: 0.0011 memory: 1132 2022/09/15 23:49:47 - mmengine - INFO - Epoch(val) [4][959/959] CUTE80/recog/word_acc_ignore_case_symbol: 0.8924 IIIT5K/recog/word_acc_ignore_case_symbol: 0.9553 SVT/recog/word_acc_ignore_case_symbol: 0.8717 SVTP/recog/word_acc_ignore_case_symbol: 0.8093 IC13/recog/word_acc_ignore_case_symbol: 0.9409 IC15/recog/word_acc_ignore_case_symbol: 0.7737 2022/09/15 23:52:48 - mmengine - INFO - Epoch(train) [5][100/2304] lr: 1.0000e-05 eta: 1:07:25 time: 1.8550 data_time: 0.3145 memory: 28222 loss_ce: 0.3014 loss: 0.3014 2022/09/15 23:55:42 - mmengine - INFO - Epoch(train) [5][200/2304] lr: 1.0000e-05 eta: 1:04:19 time: 2.2139 data_time: 0.2965 memory: 28222 loss_ce: 0.3048 loss: 0.3048 2022/09/15 23:58:32 - mmengine - INFO - Epoch(train) [5][300/2304] lr: 1.0000e-05 eta: 1:01:13 time: 1.9277 data_time: 0.2588 memory: 28222 loss_ce: 0.3073 loss: 0.3073 2022/09/16 00:01:29 - mmengine - INFO - Epoch(train) [5][400/2304] lr: 1.0000e-05 eta: 0:58:08 time: 1.6606 data_time: 0.0344 memory: 28222 loss_ce: 0.2953 loss: 0.2953 2022/09/16 00:04:20 - mmengine - INFO - Epoch(train) [5][500/2304] lr: 1.0000e-05 eta: 0:55:03 time: 1.3436 data_time: 0.0312 memory: 28222 loss_ce: 0.2977 loss: 0.2977 2022/09/16 00:07:10 - mmengine - INFO - Epoch(train) [5][600/2304] lr: 1.0000e-05 eta: 0:51:57 time: 1.3100 data_time: 0.0492 memory: 28222 loss_ce: 0.3077 loss: 0.3077 2022/09/16 00:10:06 - mmengine - INFO - Epoch(train) [5][700/2304] lr: 1.0000e-05 eta: 0:48:53 time: 1.8420 data_time: 0.2849 memory: 28222 loss_ce: 0.2959 loss: 0.2959 2022/09/16 00:12:30 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/16 00:13:01 - mmengine - INFO - Epoch(train) [5][800/2304] lr: 1.0000e-05 eta: 0:45:49 time: 2.2682 data_time: 0.2469 memory: 28222 loss_ce: 0.2980 
loss: 0.2980 2022/09/16 00:15:52 - mmengine - INFO - Epoch(train) [5][900/2304] lr: 1.0000e-05 eta: 0:42:45 time: 1.9060 data_time: 0.2269 memory: 28222 loss_ce: 0.3023 loss: 0.3023 2022/09/16 00:18:44 - mmengine - INFO - Epoch(train) [5][1000/2304] lr: 1.0000e-05 eta: 0:39:40 time: 1.6360 data_time: 0.0288 memory: 28222 loss_ce: 0.2848 loss: 0.2848 2022/09/16 00:21:35 - mmengine - INFO - Epoch(train) [5][1100/2304] lr: 1.0000e-05 eta: 0:36:37 time: 1.3434 data_time: 0.0507 memory: 28222 loss_ce: 0.2976 loss: 0.2976 2022/09/16 00:24:27 - mmengine - INFO - Epoch(train) [5][1200/2304] lr: 1.0000e-05 eta: 0:33:33 time: 1.3183 data_time: 0.0580 memory: 28222 loss_ce: 0.2930 loss: 0.2930 2022/09/16 00:27:23 - mmengine - INFO - Epoch(train) [5][1300/2304] lr: 1.0000e-05 eta: 0:30:30 time: 1.8353 data_time: 0.2888 memory: 28222 loss_ce: 0.3012 loss: 0.3012 2022/09/16 00:30:16 - mmengine - INFO - Epoch(train) [5][1400/2304] lr: 1.0000e-05 eta: 0:27:27 time: 2.1897 data_time: 0.2655 memory: 28222 loss_ce: 0.2861 loss: 0.2861 2022/09/16 00:33:10 - mmengine - INFO - Epoch(train) [5][1500/2304] lr: 1.0000e-05 eta: 0:24:24 time: 2.0365 data_time: 0.2670 memory: 28222 loss_ce: 0.2920 loss: 0.2920 2022/09/16 00:35:59 - mmengine - INFO - Epoch(train) [5][1600/2304] lr: 1.0000e-05 eta: 0:21:21 time: 1.6290 data_time: 0.0304 memory: 28222 loss_ce: 0.2943 loss: 0.2943 2022/09/16 00:38:50 - mmengine - INFO - Epoch(train) [5][1700/2304] lr: 1.0000e-05 eta: 0:18:18 time: 1.3346 data_time: 0.0449 memory: 28222 loss_ce: 0.3035 loss: 0.3035 2022/09/16 00:41:21 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/16 00:41:43 - mmengine - INFO - Epoch(train) [5][1800/2304] lr: 1.0000e-05 eta: 0:15:16 time: 1.2940 data_time: 0.0412 memory: 28222 loss_ce: 0.2935 loss: 0.2935 2022/09/16 00:44:39 - mmengine - INFO - Epoch(train) [5][1900/2304] lr: 1.0000e-05 eta: 0:12:14 time: 1.8625 data_time: 0.2918 memory: 28222 loss_ce: 0.3007 loss: 0.3007 2022/09/16 00:47:33 - mmengine - INFO - Epoch(train) [5][2000/2304] lr: 1.0000e-05 eta: 0:09:12 time: 2.2334 data_time: 0.2435 memory: 28222 loss_ce: 0.2850 loss: 0.2850 2022/09/16 00:50:26 - mmengine - INFO - Epoch(train) [5][2100/2304] lr: 1.0000e-05 eta: 0:06:10 time: 1.9851 data_time: 0.2231 memory: 28222 loss_ce: 0.3015 loss: 0.3015 2022/09/16 00:53:20 - mmengine - INFO - Epoch(train) [5][2200/2304] lr: 1.0000e-05 eta: 0:03:08 time: 1.6310 data_time: 0.0283 memory: 28222 loss_ce: 0.2905 loss: 0.2905 2022/09/16 00:56:08 - mmengine - INFO - Epoch(train) [5][2300/2304] lr: 1.0000e-05 eta: 0:00:07 time: 1.2892 data_time: 0.0298 memory: 28222 loss_ce: 0.3063 loss: 0.3063 2022/09/16 00:56:13 - mmengine - INFO - Exp name: sar_resnet31_sequential-decoder_5e_st-sub_mj-sub_sa_real_20220915_185451 2022/09/16 00:56:13 - mmengine - INFO - Saving checkpoint at 5 epochs 2022/09/16 00:56:46 - mmengine - INFO - Epoch(val) [5][100/959] eta: 0:00:45 time: 0.0529 data_time: 0.0014 memory: 28222 2022/09/16 00:56:52 - mmengine - INFO - Epoch(val) [5][200/959] eta: 0:00:36 time: 0.0480 data_time: 0.0045 memory: 1132 2022/09/16 00:56:57 - mmengine - INFO - Epoch(val) [5][300/959] eta: 0:00:34 time: 0.0519 data_time: 0.0025 memory: 1132 2022/09/16 00:57:02 - mmengine - INFO - Epoch(val) [5][400/959] eta: 0:00:31 time: 0.0555 data_time: 0.0018 memory: 1132 2022/09/16 00:57:07 - mmengine - INFO - Epoch(val) [5][500/959] eta: 0:00:23 time: 0.0507 data_time: 0.0019 memory: 1132 2022/09/16 00:57:12 - mmengine - INFO - Epoch(val) [5][600/959] eta: 
0:00:17 time: 0.0487 data_time: 0.0016 memory: 1132
2022/09/16 00:57:17 - mmengine - INFO - Epoch(val) [5][700/959] eta: 0:00:13 time: 0.0511 data_time: 0.0010 memory: 1132
2022/09/16 00:57:22 - mmengine - INFO - Epoch(val) [5][800/959] eta: 0:00:06 time: 0.0437 data_time: 0.0017 memory: 1132
2022/09/16 00:57:27 - mmengine - INFO - Epoch(val) [5][900/959] eta: 0:00:01 time: 0.0276 data_time: 0.0005 memory: 1132
2022/09/16 00:57:29 - mmengine - INFO - Epoch(val) [5][959/959]
    CUTE80/recog/word_acc_ignore_case_symbol: 0.8924
    IIIT5K/recog/word_acc_ignore_case_symbol: 0.9550
    SVT/recog/word_acc_ignore_case_symbol: 0.8686
    SVTP/recog/word_acc_ignore_case_symbol: 0.8016
    IC13/recog/word_acc_ignore_case_symbol: 0.9409
    IC15/recog/word_acc_ignore_case_symbol: 0.7727
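The run closes with per-benchmark word accuracy under the ignore_case_symbol mode configured in val_evaluator and test_evaluator, i.e. case-insensitive matching that disregards symbols. Epoch 4 scored marginally higher on IIIT5K, SVT, SVTP and IC15 and identically on CUTE80 and IC13. For a single headline figure, an unweighted mean of the six epoch-5 scores (the benchmarks differ in size, so this is not a sample-weighted accuracy):

# Unweighted mean of the final (epoch 5) word accuracies logged above.
scores = {'CUTE80': 0.8924, 'IIIT5K': 0.9550, 'SVT': 0.8686,
          'SVTP': 0.8016, 'IC13': 0.9409, 'IC15': 0.7727}
print(round(sum(scores.values()) / len(scores), 4))   # -> 0.8719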