2023-02-06 01:16:21,635 - mmrotate - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.8.5 (default, Sep 4 2020, 07:30:14) [GCC 7.3.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: GeForce RTX 3090
CUDA_HOME: /usr/local/cuda
NVCC: Cuda compilation tools, release 11.0, V11.0.221
GCC: gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
PyTorch: 1.7.1+cu110
PyTorch compiling details: PyTorch built with:
  - GCC 7.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v1.6.0 (Git Hash 5ef631a030a6f73131c77892041042805a06064f)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.0
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80
  - CuDNN 8.0.5
  - Magma 2.5.2
  - Build settings: BLAS=MKL, BUILD_TYPE=Release, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DUSE_VULKAN_WRAPPER -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, USE_CUDA=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON,
TorchVision: 0.8.2+cu110
OpenCV: 4.6.0
MMCV: 1.7.0
MMCV Compiler: GCC 7.3
MMCV CUDA Compiler: 11.0
MMRotate: 0.3.3+54107e0
------------------------------------------------------------
2023-02-06 01:16:22,637 - mmrotate - INFO - Distributed training: True
2023-02-06 01:16:23,651 - mmrotate - INFO - Config:
dataset_type = 'DOTADataset'
data_root = '/opt/data/private/LYX/data/split_ms_dota/'
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True),
    dict(type='RResize', img_scale=(1024, 1024)),
    dict(type='RRandomFlip', flip_ratio=[0.25, 0.25, 0.25],
         direction=['horizontal', 'vertical', 'diagonal'], version='le90'),
    dict(type='PolyRandomRotate', rotate_ratio=0.5, angles_range=180,
         auto_bound=False, rect_classes=[9, 11], version='le90'),
    dict(type='Normalize', mean=[123.675, 116.28, 103.53],
         std=[58.395, 57.12, 57.375], to_rgb=True),
    dict(type='Pad', size_divisor=32),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='MultiScaleFlipAug',
         img_scale=(1024, 1024),
         flip=False,
         transforms=[
             dict(type='RResize'),
             dict(type='Normalize', mean=[123.675, 116.28, 103.53],
                  std=[58.395, 57.12, 57.375], to_rgb=True),
             dict(type='Pad', size_divisor=32),
             dict(type='DefaultFormatBundle'),
             dict(type='Collect', keys=['img'])
         ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='DOTADataset',
        ann_file='/opt/data/private/LYX/data/split_ms_dota/trainval_2/annfiles/',
        img_prefix='/opt/data/private/LYX/data/split_ms_dota/trainval_2/images/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True),
            dict(type='RResize', img_scale=(1024, 1024)),
            dict(type='RRandomFlip', flip_ratio=[0.25, 0.25, 0.25],
                 direction=['horizontal', 'vertical', 'diagonal'], version='le90'),
            dict(type='PolyRandomRotate', rotate_ratio=0.5, angles_range=180,
                 auto_bound=False, rect_classes=[9, 11], version='le90'),
            dict(type='Normalize', mean=[123.675, 116.28, 103.53],
                 std=[58.395, 57.12, 57.375], to_rgb=True),
            dict(type='Pad', size_divisor=32),
            dict(type='DefaultFormatBundle'),
            dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
        ],
        version='le90'),
    val=dict(
        type='DOTADataset',
        ann_file='/opt/data/private/LYX/data/split_ms_dota/val/annfiles/',
        img_prefix='/opt/data/private/LYX/data/split_ms_dota/val/images/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='MultiScaleFlipAug',
                 img_scale=(1024, 1024),
                 flip=False,
                 transforms=[
                     dict(type='RResize'),
                     dict(type='Normalize', mean=[123.675, 116.28, 103.53],
                          std=[58.395, 57.12, 57.375], to_rgb=True),
                     dict(type='Pad', size_divisor=32),
                     dict(type='DefaultFormatBundle'),
                     dict(type='Collect', keys=['img'])
                 ])
        ],
        version='le90'),
    test=dict(
        type='DOTADataset',
        ann_file='/opt/data/private/LYX/data/split_ms_dota/test/images/',
        img_prefix='/opt/data/private/LYX/data/split_ms_dota/test/images/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='MultiScaleFlipAug',
                 img_scale=(1024, 1024),
                 flip=False,
                 transforms=[
                     dict(type='RResize'),
                     dict(type='Normalize', mean=[123.675, 116.28, 103.53],
                          std=[58.395, 57.12, 57.375], to_rgb=True),
                     dict(type='Pad', size_divisor=32),
                     dict(type='DefaultFormatBundle'),
                     dict(type='Collect', keys=['img'])
                 ])
        ],
        version='le90'))
evaluation = dict(interval=3, metric='mAP')
optimizer = dict(type='AdamW', lr=0.0002, betas=(0.9, 0.999), weight_decay=0.05)
optimizer_config = dict(grad_clip=dict(max_norm=35, norm_type=2))
lr_config = dict(
    policy='step',
    warmup='linear',
    warmup_iters=500,
    warmup_ratio=0.3333333333333333,
    step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=12)
checkpoint_config = dict(interval=1)
log_config = dict(interval=500, hooks=[dict(type='TextLoggerHook')])
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
opencv_num_threads = 0
mp_start_method = 'fork'
angle_version = 'le90'
gpu_number = 8
model = dict(
    type='OrientedRCNN',
    backbone=dict(
        type='LSKNet',
        embed_dims=[32, 64, 160, 256],
        drop_rate=0.1,
        drop_path_rate=0.1,
        depths=[3, 3, 5, 2],
        init_cfg=dict(
            type='Pretrained',
            checkpoint='/opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar'),
        norm_cfg=dict(type='SyncBN', requires_grad=True)),
    neck=dict(
        type='FPN',
        in_channels=[32, 64, 160, 256],
        out_channels=256,
        num_outs=5),
    rpn_head=dict(
        type='OrientedRPNHead',
        in_channels=256,
        feat_channels=256,
        version='le90',
        anchor_generator=dict(
            type='AnchorGenerator',
            scales=[8],
            ratios=[0.5, 1.0, 2.0],
            strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(
            type='MidpointOffsetCoder',
            angle_range='le90',
            target_means=[0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
            target_stds=[1.0, 1.0, 1.0, 1.0, 0.5, 0.5]),
        loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(
            type='SmoothL1Loss', beta=0.1111111111111111, loss_weight=1.0)),
    roi_head=dict(
        type='OrientedStandardRoIHead',
        bbox_roi_extractor=dict(
            type='RotatedSingleRoIExtractor',
            roi_layer=dict(type='RoIAlignRotated', out_size=7, sample_num=2, clockwise=True),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='RotatedShared2FCBBoxHead',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=15,
            bbox_coder=dict(
                type='DeltaXYWHAOBBoxCoder',
                angle_range='le90',
                norm_factor=None,
                edge_swap=True,
                proj_xy=True,
                target_means=(0.0, 0.0, 0.0, 0.0, 0.0),
                target_stds=(0.1, 0.1, 0.2, 0.2, 0.1)),
            reg_class_agnostic=True,
            loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0))),
    train_cfg=dict(
        rpn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.7,
                neg_iou_thr=0.3,
                min_pos_iou=0.3,
                match_low_quality=True,
                gpu_assign_thr=1000,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=256,
                pos_fraction=0.5,
                neg_pos_ub=-1,
                add_gt_as_proposals=False),
            allowed_border=0,
            pos_weight=-1,
            debug=False),
        rpn_proposal=dict(
            nms_pre=2000,
            max_per_img=2000,
            nms=dict(type='nms', iou_threshold=0.8),
            min_bbox_size=0),
        rcnn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.5,
                neg_iou_thr=0.5,
                min_pos_iou=0.5,
                match_low_quality=False,
                iou_calculator=dict(type='RBboxOverlaps2D'),
                gpu_assign_thr=1000,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RRandomSampler',
                num=512,
                pos_fraction=0.25,
                neg_pos_ub=-1,
                add_gt_as_proposals=True),
            pos_weight=-1,
            debug=False)),
    test_cfg=dict(
        rpn=dict(
            nms_pre=2000,
            max_per_img=2000,
            nms=dict(type='nms', iou_threshold=0.8),
            min_bbox_size=0),
        rcnn=dict(
            nms_pre=2000,
            min_bbox_size=0,
            score_thr=0.05,
            nms=dict(iou_thr=0.1),
            max_per_img=2000)))
work_dir = './work_dirs/lskb0_dota'
auto_resume = False
gpu_ids = range(0, 8)
2023-02-06 01:16:23,652 - mmrotate - INFO - Set random seed to 0, deterministic: False
2023-02-06 01:16:23,926 - mmrotate - INFO - initialize LSKNet with init_cfg {'type': 'Pretrained', 'checkpoint': '/opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar'}
2023-02-06 01:16:25,031 - mmrotate - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2023-02-06 01:16:25,056 - mmrotate - INFO - initialize OrientedRPNHead with init_cfg {'type': 'Normal', 'layer': 'Conv2d', 'std': 0.01}
2023-02-06 01:16:25,066 - mmrotate - INFO - initialize RotatedShared2FCBBoxHead with init_cfg [{'type': 'Normal', 'std': 0.01, 'override': {'name': 'fc_cls'}}, {'type': 'Normal', 'std': 0.001, 'override': {'name': 'fc_reg'}}, {'type': 'Xavier', 'layer': 'Linear', 'override': [{'name': 'shared_fcs'}, {'name': 'cls_fcs'}, {'name': 'reg_fcs'}]}]
Name of parameter - Initialization information
backbone.patch_embed1.proj.weight - torch.Size([32, 3, 7, 7]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar
backbone.patch_embed1.proj.bias - torch.Size([32]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar
backbone.patch_embed1.norm.weight - torch.Size([32]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar
backbone.patch_embed1.norm.bias - torch.Size([32]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar
[... the parameter listing continues in the same pattern: every remaining backbone parameter (block1.0-block1.2, norm1, patch_embed2, block2.0-block2.2, norm2, patch_embed3, block3.0-block3.3) is reported with its torch.Size followed by the same "PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar" message ...]
- torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.spatial_gating_unit.conv1.weight - torch.Size([80, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.spatial_gating_unit.conv1.bias - torch.Size([80]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.spatial_gating_unit.conv2.weight - torch.Size([80, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.spatial_gating_unit.conv2.bias - torch.Size([80]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.spatial_gating_unit.conv_squeeze.weight - torch.Size([2, 2, 7, 7]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.spatial_gating_unit.conv_squeeze.bias - torch.Size([2]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.spatial_gating_unit.conv.weight - torch.Size([160, 80, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.spatial_gating_unit.conv.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.proj_2.weight - torch.Size([160, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.attn.proj_2.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.mlp.fc1.weight - torch.Size([640, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.mlp.fc1.bias - torch.Size([640]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.mlp.dwconv.dwconv.weight - torch.Size([640, 1, 3, 3]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.mlp.dwconv.dwconv.bias - torch.Size([640]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.mlp.fc2.weight - torch.Size([160, 640, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.3.mlp.fc2.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.layer_scale_1 - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.layer_scale_2 - torch.Size([160]): PretrainedInit: load from 
/opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.norm1.weight - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.norm1.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.norm2.weight - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.norm2.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.proj_1.weight - torch.Size([160, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.proj_1.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv0.weight - torch.Size([160, 1, 5, 5]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv0.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv_spatial.weight - torch.Size([160, 1, 7, 7]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv_spatial.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv1.weight - torch.Size([80, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv1.bias - torch.Size([80]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv2.weight - torch.Size([80, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv2.bias - torch.Size([80]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv_squeeze.weight - torch.Size([2, 2, 7, 7]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv_squeeze.bias - torch.Size([2]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv.weight - torch.Size([160, 80, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.spatial_gating_unit.conv.bias - torch.Size([160]): 
PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.proj_2.weight - torch.Size([160, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.attn.proj_2.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.mlp.fc1.weight - torch.Size([640, 160, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.mlp.fc1.bias - torch.Size([640]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.mlp.dwconv.dwconv.weight - torch.Size([640, 1, 3, 3]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.mlp.dwconv.dwconv.bias - torch.Size([640]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.mlp.fc2.weight - torch.Size([160, 640, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block3.4.mlp.fc2.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.norm3.weight - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.norm3.bias - torch.Size([160]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.patch_embed4.proj.weight - torch.Size([256, 160, 3, 3]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.patch_embed4.proj.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.patch_embed4.norm.weight - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.patch_embed4.norm.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.layer_scale_1 - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.layer_scale_2 - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.norm1.weight - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.norm1.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.norm2.weight - torch.Size([256]): PretrainedInit: load from 
/opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.norm2.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.proj_1.weight - torch.Size([256, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.proj_1.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv0.weight - torch.Size([256, 1, 5, 5]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv0.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv_spatial.weight - torch.Size([256, 1, 7, 7]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv_spatial.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv1.weight - torch.Size([128, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv1.bias - torch.Size([128]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv2.weight - torch.Size([128, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv2.bias - torch.Size([128]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv_squeeze.weight - torch.Size([2, 2, 7, 7]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv_squeeze.bias - torch.Size([2]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv.weight - torch.Size([256, 128, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.spatial_gating_unit.conv.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.proj_2.weight - torch.Size([256, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.attn.proj_2.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.mlp.fc1.weight - 
torch.Size([1024, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.mlp.fc1.bias - torch.Size([1024]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.mlp.dwconv.dwconv.weight - torch.Size([1024, 1, 3, 3]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.mlp.dwconv.dwconv.bias - torch.Size([1024]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.mlp.fc2.weight - torch.Size([256, 1024, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.0.mlp.fc2.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.layer_scale_1 - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.layer_scale_2 - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.norm1.weight - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.norm1.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.norm2.weight - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.norm2.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.proj_1.weight - torch.Size([256, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.proj_1.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv0.weight - torch.Size([256, 1, 5, 5]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv0.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv_spatial.weight - torch.Size([256, 1, 7, 7]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv_spatial.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv1.weight - torch.Size([128, 256, 1, 1]): PretrainedInit: load from 
/opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv1.bias - torch.Size([128]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv2.weight - torch.Size([128, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv2.bias - torch.Size([128]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv_squeeze.weight - torch.Size([2, 2, 7, 7]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv_squeeze.bias - torch.Size([2]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv.weight - torch.Size([256, 128, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.spatial_gating_unit.conv.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.proj_2.weight - torch.Size([256, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.attn.proj_2.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.mlp.fc1.weight - torch.Size([1024, 256, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.mlp.fc1.bias - torch.Size([1024]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.mlp.dwconv.dwconv.weight - torch.Size([1024, 1, 3, 3]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.mlp.dwconv.dwconv.bias - torch.Size([1024]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.mlp.fc2.weight - torch.Size([256, 1024, 1, 1]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.block4.1.mlp.fc2.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.norm4.weight - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar backbone.norm4.bias - torch.Size([256]): PretrainedInit: load from /opt/data/private/VAN-Classification/output/train/20230201-002519-lsknet_b0-224/model_best.pth.tar neck.lateral_convs.0.conv.weight - torch.Size([256, 32, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is 
the same before and after calling `init_weights` of OrientedRCNN neck.lateral_convs.1.conv.weight - torch.Size([256, 64, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of OrientedRCNN neck.lateral_convs.2.conv.weight - torch.Size([256, 160, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of OrientedRCNN neck.lateral_convs.3.conv.weight - torch.Size([256, 256, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of OrientedRCNN neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of OrientedRCNN neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of OrientedRCNN neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of OrientedRCNN neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of OrientedRCNN rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_conv.bias - torch.Size([256]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.bias - torch.Size([3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.weight - torch.Size([18, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.bias - torch.Size([18]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.weight - torch.Size([16, 1024]): XavierInit: gain=1, distribution=normal, bias=0 roi_head.bbox_head.fc_cls.bias - torch.Size([16]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_reg.weight - torch.Size([5, 1024]): XavierInit: gain=1, distribution=normal, bias=0 roi_head.bbox_head.fc_reg.bias - torch.Size([5]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): XavierInit: gain=1, distribution=normal, bias=0 roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): XavierInit: gain=1, distribution=normal, bias=0 roi_head.bbox_head.shared_fcs.1.weight - torch.Size([1024, 1024]): XavierInit: gain=1, distribution=normal, bias=0 roi_head.bbox_head.shared_fcs.1.bias - torch.Size([1024]): XavierInit: gain=1, distribution=normal, bias=0 2023-02-06 01:41:21,406 - mmrotate - INFO - Start running, host: root@interactive79842, work_dir: /opt/data/private/LYX/selective-large-kernel/work_dirs/lskb0_dota 2023-02-06 01:41:21,407 - mmrotate - INFO - Hooks will be executed in the following order: before_run: (VERY_HIGH ) StepLrUpdaterHook (NORMAL ) CheckpointHook (LOW ) DistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_epoch: (VERY_HIGH ) StepLrUpdaterHook (NORMAL ) 
DistSamplerSeedHook (LOW ) IterTimerHook (LOW ) DistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_iter: (VERY_HIGH ) StepLrUpdaterHook (LOW ) IterTimerHook (LOW ) DistEvalHook -------------------- after_train_iter: (ABOVE_NORMAL) OptimizerHook (NORMAL ) CheckpointHook (LOW ) IterTimerHook (LOW ) DistEvalHook (VERY_LOW ) TextLoggerHook -------------------- after_train_epoch: (NORMAL ) CheckpointHook (LOW ) DistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_val_epoch: (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (VERY_LOW ) TextLoggerHook -------------------- before_val_iter: (LOW ) IterTimerHook -------------------- after_val_iter: (LOW ) IterTimerHook -------------------- after_val_epoch: (VERY_LOW ) TextLoggerHook -------------------- after_run: (VERY_LOW ) TextLoggerHook -------------------- 2023-02-06 01:41:21,408 - mmrotate - INFO - workflow: [('train', 1)], max: 12 epochs 2023-02-06 01:41:21,408 - mmrotate - INFO - Checkpoints will be saved to /opt/data/private/LYX/selective-large-kernel/work_dirs/lskb0_dota by HardDiskBackend. 2023-02-06 01:44:38,782 - mmrotate - INFO - Epoch [1][500/4271] lr: 1.997e-04, eta: 5:33:21, time: 0.394, data_time: 0.014, memory: 11549, loss_rpn_cls: 0.1511, loss_rpn_bbox: 0.1604, loss_cls: 0.2755, acc: 92.9317, loss_bbox: 0.2380, loss: 0.8249, grad_norm: 4.4691 2023-02-06 01:48:10,796 - mmrotate - INFO - Exp name: lskb0_dota.py 2023-02-06 01:48:10,797 - mmrotate - INFO - Epoch [1][1000/4271] lr: 2.000e-04, eta: 5:42:33, time: 0.424, data_time: 0.007, memory: 11549, loss_rpn_cls: 0.0593, loss_rpn_bbox: 0.1021, loss_cls: 0.2199, acc: 92.3359, loss_bbox: 0.2240, loss: 0.6053, grad_norm: 2.4329 2023-02-06 01:51:18,165 - mmrotate - INFO - Epoch [1][1500/4271] lr: 2.000e-04, eta: 5:29:42, time: 0.375, data_time: 0.008, memory: 12316, loss_rpn_cls: 0.0447, loss_rpn_bbox: 0.0856, loss_cls: 0.1963, acc: 92.8421, loss_bbox: 0.1934, loss: 0.5200, grad_norm: 2.1958 2023-02-06 01:54:25,143 - mmrotate - INFO - Exp name: lskb0_dota.py 2023-02-06 01:54:25,143 - mmrotate - INFO - Epoch [1][2000/4271] lr: 2.000e-04, eta: 5:21:32, time: 0.374, data_time: 0.008, memory: 12316, loss_rpn_cls: 0.0400, loss_rpn_bbox: 0.0771, loss_cls: 0.1834, acc: 93.1576, loss_bbox: 0.1781, loss: 0.4785, grad_norm: 2.0781 2023-02-06 01:57:31,031 - mmrotate - INFO - Epoch [1][2500/4271] lr: 2.000e-04, eta: 5:15:01, time: 0.372, data_time: 0.008, memory: 12316, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0749, loss_cls: 0.1759, acc: 93.3367, loss_bbox: 0.1687, loss: 0.4542, grad_norm: 1.9232 2023-02-06 02:01:57,642 - mmrotate - INFO - Exp name: lskb0_dota.py 2023-02-06 02:01:57,642 - mmrotate - INFO - Epoch [1][3000/4271] lr: 2.000e-04, eta: 5:31:18, time: 0.533, data_time: 0.008, memory: 12581, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0727, loss_cls: 0.1685, acc: 93.5990, loss_bbox: 0.1620, loss: 0.4361, grad_norm: 1.8993 2023-02-06 02:05:00,604 - mmrotate - INFO - Epoch [1][3500/4271] lr: 2.000e-04, eta: 5:22:37, time: 0.366, data_time: 0.008, memory: 12581, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0674, loss_cls: 0.1636, acc: 93.7593, loss_bbox: 0.1564, loss: 0.4166, grad_norm: 1.8002 2023-02-06 02:08:04,601 - mmrotate - INFO - Exp name: lskb0_dota.py 2023-02-06 02:08:04,602 - mmrotate - INFO - Epoch [1][4000/4271] lr: 2.000e-04, eta: 5:15:34, time: 0.368, data_time: 0.008, memory: 12581, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0695, loss_cls: 0.1624, acc: 93.7719, loss_bbox: 0.1548, loss: 0.4160, grad_norm: 1.8003 2023-02-06 02:09:43,554 - mmrotate - INFO - 
Saving checkpoint at 1 epochs 2023-02-06 02:12:57,521 - mmrotate - INFO - Epoch [2][500/4271] lr: 2.000e-04, eta: 4:51:30, time: 0.385, data_time: 0.014, memory: 12587, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0644, loss_cls: 0.1537, acc: 94.0662, loss_bbox: 0.1475, loss: 0.3916, grad_norm: 1.6945 2023-02-06 02:17:47,710 - mmrotate - INFO - Epoch [2][1000/4271] lr: 2.000e-04, eta: 5:03:12, time: 0.580, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0651, loss_cls: 0.1542, acc: 94.0307, loss_bbox: 0.1466, loss: 0.3923, grad_norm: 1.6946 2023-02-06 02:20:53,354 - mmrotate - INFO - Epoch [2][1500/4271] lr: 2.000e-04, eta: 4:58:18, time: 0.371, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0612, loss_cls: 0.1497, acc: 94.1448, loss_bbox: 0.1441, loss: 0.3791, grad_norm: 1.6684 2023-02-06 02:25:47,400 - mmrotate - INFO - Epoch [2][2000/4271] lr: 2.000e-04, eta: 5:06:39, time: 0.588, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0613, loss_cls: 0.1504, acc: 94.1351, loss_bbox: 0.1417, loss: 0.3783, grad_norm: 1.6199 2023-02-06 02:28:51,691 - mmrotate - INFO - Epoch [2][2500/4271] lr: 2.000e-04, eta: 5:01:02, time: 0.369, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0564, loss_cls: 0.1431, acc: 94.3984, loss_bbox: 0.1371, loss: 0.3586, grad_norm: 1.5931 2023-02-06 02:32:15,049 - mmrotate - INFO - Epoch [2][3000/4271] lr: 2.000e-04, eta: 4:57:41, time: 0.407, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0598, loss_cls: 0.1468, acc: 94.2359, loss_bbox: 0.1389, loss: 0.3684, grad_norm: 1.5878 2023-02-06 02:35:18,525 - mmrotate - INFO - Epoch [2][3500/4271] lr: 2.000e-04, eta: 4:52:28, time: 0.367, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0553, loss_cls: 0.1436, acc: 94.3164, loss_bbox: 0.1343, loss: 0.3545, grad_norm: 1.5264 2023-02-06 02:38:23,309 - mmrotate - INFO - Epoch [2][4000/4271] lr: 2.000e-04, eta: 4:47:38, time: 0.369, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0538, loss_cls: 0.1383, acc: 94.5475, loss_bbox: 0.1315, loss: 0.3446, grad_norm: 1.5196 2023-02-06 02:40:03,012 - mmrotate - INFO - Saving checkpoint at 2 epochs 2023-02-06 02:43:18,292 - mmrotate - INFO - Epoch [3][500/4271] lr: 2.000e-04, eta: 4:33:26, time: 0.387, data_time: 0.014, memory: 12587, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0525, loss_cls: 0.1390, acc: 94.4994, loss_bbox: 0.1314, loss: 0.3429, grad_norm: 1.4738 2023-02-06 02:46:26,412 - mmrotate - INFO - Epoch [3][1000/4271] lr: 2.000e-04, eta: 4:29:45, time: 0.376, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0543, loss_cls: 0.1371, acc: 94.5776, loss_bbox: 0.1317, loss: 0.3431, grad_norm: 1.4861 2023-02-06 02:49:32,525 - mmrotate - INFO - Epoch [3][1500/4271] lr: 2.000e-04, eta: 4:25:58, time: 0.372, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0521, loss_cls: 0.1352, acc: 94.6224, loss_bbox: 0.1289, loss: 0.3341, grad_norm: 1.4440 2023-02-06 02:52:39,045 - mmrotate - INFO - Epoch [3][2000/4271] lr: 2.000e-04, eta: 4:22:17, time: 0.373, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0542, loss_cls: 0.1357, acc: 94.6034, loss_bbox: 0.1287, loss: 0.3378, grad_norm: 1.4975 2023-02-06 02:56:05,370 - mmrotate - INFO - Epoch [3][2500/4271] lr: 2.000e-04, eta: 4:19:51, time: 0.413, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0533, loss_cls: 0.1343, acc: 94.6114, loss_bbox: 0.1273, 
loss: 0.3331, grad_norm: 1.4566
2023-02-06 02:59:32,595 - mmrotate - INFO - Epoch [3][3000/4271] lr: 2.000e-04, eta: 4:17:23, time: 0.414, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0539, loss_cls: 0.1306, acc: 94.8004, loss_bbox: 0.1269, loss: 0.3296, grad_norm: 1.4604
2023-02-06 03:02:38,706 - mmrotate - INFO - Epoch [3][3500/4271] lr: 2.000e-04, eta: 4:13:42, time: 0.372, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0534, loss_cls: 0.1297, acc: 94.8434, loss_bbox: 0.1245, loss: 0.3248, grad_norm: 1.4336
2023-02-06 03:05:41,169 - mmrotate - INFO - Epoch [3][4000/4271] lr: 2.000e-04, eta: 4:09:52, time: 0.365, data_time: 0.008, memory: 12587, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0513, loss_cls: 0.1288, acc: 94.8555, loss_bbox: 0.1229, loss: 0.3203, grad_norm: 1.4083
2023-02-06 03:07:23,675 - mmrotate - INFO - Saving checkpoint at 3 epochs
2023-02-06 03:10:02,413 - mmrotate - INFO -
+--------------------+-------+--------+--------+-------+
| class              | gts   | dets   | recall | ap    |
+--------------------+-------+--------+--------+-------+
| plane              | 23860 | 34601  | 0.942  | 0.900 |
| baseball-diamond   | 1727  | 4855   | 0.896  | 0.788 |
| bridge             | 4142  | 15560  | 0.719  | 0.554 |
| ground-track-field | 1134  | 4950   | 0.956  | 0.835 |
| small-vehicle      | 33183 | 100586 | 0.871  | 0.685 |
| large-vehicle      | 29737 | 82312  | 0.937  | 0.837 |
| ship               | 80574 | 103521 | 0.909  | 0.884 |
| tennis-court       | 4389  | 9543   | 0.974  | 0.904 |
| basketball-court   | 1097  | 2779   | 0.927  | 0.817 |
| storage-tank       | 28751 | 30934  | 0.726  | 0.703 |
| soccer-ball-field  | 1242  | 4763   | 0.882  | 0.727 |
| roundabout         | 1536  | 3017   | 0.773  | 0.673 |
| harbor             | 15489 | 28685  | 0.851  | 0.750 |
| swimming-pool      | 3456  | 8003   | 0.820  | 0.680 |
| helicopter         | 765   | 3037   | 0.884  | 0.745 |
+--------------------+-------+--------+--------+-------+
| mAP                |       |        |        | 0.766 |
+--------------------+-------+--------+--------+-------+
2023-02-06 03:10:02,555 - mmrotate - INFO - Exp name: lskb0_dota.py
2023-02-06 03:10:02,555 - mmrotate - INFO - Epoch(val) [3][2034] mAP: 0.7655
2023-02-06 03:13:11,523 - mmrotate - INFO - Epoch [4][500/4271] lr: 2.000e-04, eta: 3:59:40, time: 0.378, data_time: 0.012, memory: 12587, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0513, loss_cls: 0.1297, acc: 94.7774, loss_bbox: 0.1225, loss: 0.3206, grad_norm: 1.4228
2023-02-06 03:16:19,779 - mmrotate - INFO - Epoch [4][1000/4271] lr: 2.000e-04, eta: 3:56:27, time: 0.377, data_time: 0.007, memory: 12587, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0514, loss_cls: 0.1251, acc: 94.9904, loss_bbox: 0.1211, loss: 0.3142, grad_norm: 1.4079
2023-02-06 03:19:24,317 - mmrotate - INFO - Epoch [4][1500/4271] lr: 2.000e-04, eta: 3:53:05, time: 0.369, data_time: 0.007, memory: 12587, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0499, loss_cls: 0.1252, acc: 94.9822, loss_bbox: 0.1209, loss: 0.3126, grad_norm: 1.3697
2023-02-06 03:22:27,931 - mmrotate - INFO - Epoch [4][2000/4271] lr: 2.000e-04, eta: 3:49:42, time: 0.367, data_time: 0.007, memory: 12587, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0484, loss_cls: 0.1239, acc: 95.0162, loss_bbox: 0.1191, loss: 0.3070, grad_norm: 1.4009
2023-02-06 03:25:33,666 - mmrotate - INFO - Epoch [4][2500/4271] lr: 2.000e-04, eta: 3:46:25, time: 0.371, data_time: 0.007, memory: 12587, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0493, loss_cls: 0.1244, acc: 94.9651, loss_bbox: 0.1190, loss: 0.3080, grad_norm: 1.3779
2023-02-06 03:28:40,219 - mmrotate - INFO - Epoch [4][3000/4271] lr: 2.000e-04, eta: 3:43:10, time: 0.373, data_time:
0.007, memory: 12587, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0508, loss_cls: 0.1232, acc: 95.0342, loss_bbox: 0.1192, loss: 0.3082, grad_norm: 1.3717 2023-02-06 03:31:46,172 - mmrotate - INFO - Epoch [4][3500/4271] lr: 2.000e-04, eta: 3:39:55, time: 0.372, data_time: 0.007, memory: 12587, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0474, loss_cls: 0.1204, acc: 95.1852, loss_bbox: 0.1161, loss: 0.2994, grad_norm: 1.3650 2023-02-06 03:36:35,942 - mmrotate - INFO - Epoch [4][4000/4271] lr: 2.000e-04, eta: 3:40:13, time: 0.580, data_time: 0.007, memory: 12587, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0473, loss_cls: 0.1195, acc: 95.1429, loss_bbox: 0.1184, loss: 0.3003, grad_norm: 1.3573 2023-02-06 03:38:36,167 - mmrotate - INFO - Saving checkpoint at 4 epochs 2023-02-06 03:41:49,156 - mmrotate - INFO - Epoch [5][500/4271] lr: 2.000e-04, eta: 3:31:57, time: 0.383, data_time: 0.013, memory: 12587, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0474, loss_cls: 0.1168, acc: 95.2729, loss_bbox: 0.1149, loss: 0.2926, grad_norm: 1.3220 2023-02-06 03:44:58,472 - mmrotate - INFO - Epoch [5][1000/4271] lr: 2.000e-04, eta: 3:28:49, time: 0.379, data_time: 0.007, memory: 12587, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0476, loss_cls: 0.1182, acc: 95.2089, loss_bbox: 0.1156, loss: 0.2959, grad_norm: 1.3594 2023-02-06 03:48:07,880 - mmrotate - INFO - Epoch [5][1500/4271] lr: 2.000e-04, eta: 3:25:41, time: 0.379, data_time: 0.007, memory: 12587, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0502, loss_cls: 0.1187, acc: 95.1927, loss_bbox: 0.1155, loss: 0.2995, grad_norm: 1.3677 2023-02-06 03:51:36,313 - mmrotate - INFO - Epoch [5][2000/4271] lr: 2.000e-04, eta: 3:23:05, time: 0.417, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0459, loss_cls: 0.1170, acc: 95.2582, loss_bbox: 0.1138, loss: 0.2908, grad_norm: 1.3121 2023-02-06 03:54:44,634 - mmrotate - INFO - Epoch [5][2500/4271] lr: 2.000e-04, eta: 3:19:54, time: 0.377, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0477, loss_cls: 0.1162, acc: 95.2910, loss_bbox: 0.1142, loss: 0.2922, grad_norm: 1.3463 2023-02-06 03:57:54,096 - mmrotate - INFO - Epoch [5][3000/4271] lr: 2.000e-04, eta: 3:16:45, time: 0.379, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0487, loss_cls: 0.1150, acc: 95.3308, loss_bbox: 0.1126, loss: 0.2902, grad_norm: 1.3436 2023-02-06 04:01:03,492 - mmrotate - INFO - Epoch [5][3500/4271] lr: 2.000e-04, eta: 3:13:35, time: 0.379, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0477, loss_cls: 0.1161, acc: 95.2946, loss_bbox: 0.1140, loss: 0.2923, grad_norm: 1.3227 2023-02-06 04:05:58,214 - mmrotate - INFO - Epoch [5][4000/4271] lr: 2.000e-04, eta: 3:12:57, time: 0.589, data_time: 0.008, memory: 12590, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0463, loss_cls: 0.1149, acc: 95.3410, loss_bbox: 0.1102, loss: 0.2860, grad_norm: 1.3239 2023-02-06 04:07:38,644 - mmrotate - INFO - Saving checkpoint at 5 epochs 2023-02-06 04:10:52,251 - mmrotate - INFO - Epoch [6][500/4271] lr: 2.000e-04, eta: 3:05:41, time: 0.384, data_time: 0.013, memory: 12590, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0459, loss_cls: 0.1135, acc: 95.3678, loss_bbox: 0.1117, loss: 0.2834, grad_norm: 1.2826 2023-02-06 04:14:00,154 - mmrotate - INFO - Epoch [6][1000/4271] lr: 2.000e-04, eta: 3:02:30, time: 0.376, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0452, loss_cls: 0.1136, acc: 95.3913, loss_bbox: 0.1127, loss: 0.2850, grad_norm: 1.3323 2023-02-06 04:18:40,786 - mmrotate - INFO 
- Epoch [6][1500/4271] lr: 2.000e-04, eta: 3:01:14, time: 0.561, data_time: 0.008, memory: 12590, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0460, loss_cls: 0.1116, acc: 95.4606, loss_bbox: 0.1090, loss: 0.2795, grad_norm: 1.3072
2023-02-06 04:22:12,280 - mmrotate - INFO - Epoch [6][2000/4271] lr: 2.000e-04, eta: 2:58:26, time: 0.423, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0434, loss_cls: 0.1116, acc: 95.4580, loss_bbox: 0.1104, loss: 0.2780, grad_norm: 1.2785
2023-02-06 04:25:19,190 - mmrotate - INFO - Epoch [6][2500/4271] lr: 2.000e-04, eta: 2:55:08, time: 0.374, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0472, loss_cls: 0.1131, acc: 95.4077, loss_bbox: 0.1110, loss: 0.2842, grad_norm: 1.3037
2023-02-06 04:28:26,031 - mmrotate - INFO - Epoch [6][3000/4271] lr: 2.000e-04, eta: 2:51:51, time: 0.374, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0461, loss_cls: 0.1130, acc: 95.4036, loss_bbox: 0.1114, loss: 0.2832, grad_norm: 1.2868
2023-02-06 04:31:34,241 - mmrotate - INFO - Epoch [6][3500/4271] lr: 2.000e-04, eta: 2:48:36, time: 0.376, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0464, loss_cls: 0.1090, acc: 95.5797, loss_bbox: 0.1092, loss: 0.2770, grad_norm: 1.2839
2023-02-06 04:34:42,176 - mmrotate - INFO - Epoch [6][4000/4271] lr: 2.000e-04, eta: 2:45:20, time: 0.376, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0447, loss_cls: 0.1102, acc: 95.4983, loss_bbox: 0.1091, loss: 0.2766, grad_norm: 1.2846
2023-02-06 04:36:23,710 - mmrotate - INFO - Saving checkpoint at 6 epochs
2023-02-06 04:38:48,200 - mmrotate - INFO -
+--------------------+-------+-------+--------+-------+
| class              | gts   | dets  | recall | ap    |
+--------------------+-------+-------+--------+-------+
| plane              | 23860 | 30339 | 0.948  | 0.903 |
| baseball-diamond   | 1727  | 3908  | 0.916  | 0.864 |
| bridge             | 4142  | 9780  | 0.774  | 0.631 |
| ground-track-field | 1134  | 7624  | 0.966  | 0.848 |
| small-vehicle      | 33183 | 68236 | 0.858  | 0.725 |
| large-vehicle      | 29737 | 72412 | 0.959  | 0.867 |
| ship               | 80574 | 97024 | 0.921  | 0.894 |
| tennis-court       | 4389  | 7496  | 0.981  | 0.908 |
| basketball-court   | 1097  | 2904  | 0.970  | 0.875 |
| storage-tank       | 28751 | 35887 | 0.721  | 0.704 |
| soccer-ball-field  | 1242  | 5128  | 0.904  | 0.804 |
| roundabout         | 1536  | 4200  | 0.835  | 0.775 |
| harbor             | 15489 | 22480 | 0.878  | 0.783 |
| swimming-pool      | 3456  | 8638  | 0.850  | 0.729 |
| helicopter         | 765   | 1658  | 0.899  | 0.797 |
+--------------------+-------+-------+--------+-------+
| mAP                |       |       |        | 0.807 |
+--------------------+-------+-------+--------+-------+
2023-02-06 04:38:48,308 - mmrotate - INFO - Exp name: lskb0_dota.py
2023-02-06 04:38:48,308 - mmrotate - INFO - Epoch(val) [6][2034] mAP: 0.8071
2023-02-06 04:41:58,652 - mmrotate - INFO - Epoch [7][500/4271] lr: 2.000e-04, eta: 2:38:44, time: 0.381, data_time: 0.013, memory: 12590, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0451, loss_cls: 0.1080, acc: 95.5938, loss_bbox: 0.1062, loss: 0.2717, grad_norm: 1.2868
2023-02-06 04:46:39,766 - mmrotate - INFO - Epoch [7][1000/4271] lr: 2.000e-04, eta: 2:36:59, time: 0.562, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0462, loss_cls: 0.1088, acc: 95.5621, loss_bbox: 0.1088, loss: 0.2756, grad_norm: 1.2646
2023-02-06 04:49:44,303 - mmrotate - INFO - Epoch [7][1500/4271] lr: 2.000e-04, eta: 2:33:42, time: 0.369, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0448, loss_cls:
0.1089, acc: 95.5650, loss_bbox: 0.1075, loss: 0.2738, grad_norm: 1.2659 2023-02-06 04:52:49,329 - mmrotate - INFO - Epoch [7][2000/4271] lr: 2.000e-04, eta: 2:30:26, time: 0.370, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0449, loss_cls: 0.1092, acc: 95.5591, loss_bbox: 0.1089, loss: 0.2752, grad_norm: 1.3034 2023-02-06 04:55:53,817 - mmrotate - INFO - Epoch [7][2500/4271] lr: 2.000e-04, eta: 2:27:09, time: 0.369, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0444, loss_cls: 0.1073, acc: 95.6342, loss_bbox: 0.1072, loss: 0.2712, grad_norm: 1.2820 2023-02-06 05:00:35,819 - mmrotate - INFO - Epoch [7][3000/4271] lr: 2.000e-04, eta: 2:25:10, time: 0.564, data_time: 0.007, memory: 12590, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0434, loss_cls: 0.1067, acc: 95.6334, loss_bbox: 0.1063, loss: 0.2683, grad_norm: 1.2753 2023-02-06 05:03:40,997 - mmrotate - INFO - Epoch [7][3500/4271] lr: 2.000e-04, eta: 2:21:52, time: 0.370, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0454, loss_cls: 0.1066, acc: 95.6282, loss_bbox: 0.1059, loss: 0.2694, grad_norm: 1.2449 2023-02-06 05:06:46,946 - mmrotate - INFO - Epoch [7][4000/4271] lr: 2.000e-04, eta: 2:18:35, time: 0.372, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0446, loss_cls: 0.1077, acc: 95.5712, loss_bbox: 0.1074, loss: 0.2707, grad_norm: 1.2399 2023-02-06 05:08:28,281 - mmrotate - INFO - Saving checkpoint at 7 epochs 2023-02-06 05:11:58,417 - mmrotate - INFO - Epoch [8][500/4271] lr: 2.000e-04, eta: 2:12:38, time: 0.417, data_time: 0.013, memory: 12592, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0440, loss_cls: 0.1042, acc: 95.7203, loss_bbox: 0.1063, loss: 0.2658, grad_norm: 1.2537 2023-02-06 05:15:24,916 - mmrotate - INFO - Epoch [8][1000/4271] lr: 2.000e-04, eta: 2:09:38, time: 0.413, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0430, loss_cls: 0.1042, acc: 95.7487, loss_bbox: 0.1058, loss: 0.2641, grad_norm: 1.2348 2023-02-06 05:18:25,763 - mmrotate - INFO - Epoch [8][1500/4271] lr: 2.000e-04, eta: 2:06:20, time: 0.362, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0422, loss_cls: 0.1049, acc: 95.7024, loss_bbox: 0.1041, loss: 0.2625, grad_norm: 1.2525 2023-02-06 05:21:30,661 - mmrotate - INFO - Epoch [8][2000/4271] lr: 2.000e-04, eta: 2:03:06, time: 0.370, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0435, loss_cls: 0.1064, acc: 95.6472, loss_bbox: 0.1058, loss: 0.2675, grad_norm: 1.2462 2023-02-06 05:24:33,982 - mmrotate - INFO - Epoch [8][2500/4271] lr: 2.000e-04, eta: 1:59:51, time: 0.367, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0448, loss_cls: 0.1068, acc: 95.6422, loss_bbox: 0.1052, loss: 0.2687, grad_norm: 1.2877 2023-02-06 05:27:38,054 - mmrotate - INFO - Epoch [8][3000/4271] lr: 2.000e-04, eta: 1:56:36, time: 0.368, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0425, loss_cls: 0.1063, acc: 95.6578, loss_bbox: 0.1058, loss: 0.2665, grad_norm: 1.2569 2023-02-06 05:30:43,865 - mmrotate - INFO - Epoch [8][3500/4271] lr: 2.000e-04, eta: 1:53:23, time: 0.372, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0438, loss_cls: 0.1064, acc: 95.6315, loss_bbox: 0.1054, loss: 0.2664, grad_norm: 1.2496 2023-02-06 05:33:47,939 - mmrotate - INFO - Epoch [8][4000/4271] lr: 2.000e-04, eta: 1:50:09, time: 0.368, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0445, 
loss_cls: 0.1045, acc: 95.7191, loss_bbox: 0.1036, loss: 0.2637, grad_norm: 1.2550
2023-02-06 05:35:27,634 - mmrotate - INFO - Saving checkpoint at 8 epochs
2023-02-06 05:40:20,187 - mmrotate - INFO - Epoch [9][500/4271] lr: 2.000e-05, eta: 1:45:14, time: 0.582, data_time: 0.012, memory: 12592, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0385, loss_cls: 0.0944, acc: 96.1080, loss_bbox: 0.0963, loss: 0.2390, grad_norm: 1.0461
2023-02-06 05:43:23,085 - mmrotate - INFO - Epoch [9][1000/4271] lr: 2.000e-05, eta: 1:42:01, time: 0.366, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0086, loss_rpn_bbox: 0.0372, loss_cls: 0.0926, acc: 96.1826, loss_bbox: 0.0971, loss: 0.2356, grad_norm: 1.0355
2023-02-06 05:46:26,067 - mmrotate - INFO - Epoch [9][1500/4271] lr: 2.000e-05, eta: 1:38:47, time: 0.366, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0082, loss_rpn_bbox: 0.0362, loss_cls: 0.0892, acc: 96.3030, loss_bbox: 0.0948, loss: 0.2284, grad_norm: 1.0070
2023-02-06 05:49:30,118 - mmrotate - INFO - Epoch [9][2000/4271] lr: 2.000e-05, eta: 1:35:34, time: 0.368, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0078, loss_rpn_bbox: 0.0351, loss_cls: 0.0886, acc: 96.3254, loss_bbox: 0.0922, loss: 0.2236, grad_norm: 1.0227
2023-02-06 05:52:35,856 - mmrotate - INFO - Epoch [9][2500/4271] lr: 2.000e-05, eta: 1:32:22, time: 0.371, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0076, loss_rpn_bbox: 0.0364, loss_cls: 0.0871, acc: 96.4005, loss_bbox: 0.0916, loss: 0.2227, grad_norm: 1.0058
2023-02-06 05:55:38,071 - mmrotate - INFO - Epoch [9][3000/4271] lr: 2.000e-05, eta: 1:29:09, time: 0.364, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0078, loss_rpn_bbox: 0.0355, loss_cls: 0.0873, acc: 96.3664, loss_bbox: 0.0917, loss: 0.2223, grad_norm: 0.9973
2023-02-06 05:59:02,311 - mmrotate - INFO - Epoch [9][3500/4271] lr: 2.000e-05, eta: 1:26:05, time: 0.409, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0077, loss_rpn_bbox: 0.0356, loss_cls: 0.0889, acc: 96.3078, loss_bbox: 0.0939, loss: 0.2262, grad_norm: 1.0439
2023-02-06 06:02:07,248 - mmrotate - INFO - Epoch [9][4000/4271] lr: 2.000e-05, eta: 1:22:53, time: 0.370, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0077, loss_rpn_bbox: 0.0356, loss_cls: 0.0878, acc: 96.3540, loss_bbox: 0.0923, loss: 0.2234, grad_norm: 1.0222
2023-02-06 06:03:46,815 - mmrotate - INFO - Saving checkpoint at 9 epochs
2023-02-06 06:06:16,916 - mmrotate - INFO -
+--------------------+-------+-------+--------+-------+
| class              | gts   | dets  | recall | ap    |
+--------------------+-------+-------+--------+-------+
| plane              | 23860 | 27823 | 0.958  | 0.906 |
| baseball-diamond   | 1727  | 3067  | 0.946  | 0.897 |
| bridge             | 4142  | 10847 | 0.828  | 0.728 |
| ground-track-field | 1134  | 3164  | 0.981  | 0.888 |
| small-vehicle      | 33183 | 56160 | 0.895  | 0.769 |
| large-vehicle      | 29737 | 53488 | 0.964  | 0.883 |
| ship               | 80574 | 90458 | 0.919  | 0.898 |
| tennis-court       | 4389  | 5943  | 0.985  | 0.908 |
| basketball-court   | 1097  | 2188  | 0.991  | 0.893 |
| storage-tank       | 28751 | 27449 | 0.718  | 0.712 |
| soccer-ball-field  | 1242  | 3136  | 0.924  | 0.853 |
| roundabout         | 1536  | 3496  | 0.899  | 0.810 |
| harbor             | 15489 | 23013 | 0.914  | 0.869 |
| swimming-pool      | 3456  | 6123  | 0.866  | 0.751 |
| helicopter         | 765   | 1562  | 0.924  | 0.877 |
+--------------------+-------+-------+--------+-------+
| mAP                |       |       |        | 0.843 |
+--------------------+-------+-------+--------+-------+
2023-02-06 06:06:17,033 - mmrotate - INFO - Exp name: lskb0_dota.py
2023-02-06 06:06:17,034 - mmrotate - INFO - Epoch(val) [9][2034] mAP: 0.8427
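The mAP reported by the evaluation hook is simply the unweighted mean of the per-class AP column in the table above. A minimal standalone sketch of that arithmetic, with the AP values copied by hand from the epoch-9 table (not read from the log programmatically); the small difference from the logged 0.8427 is only because the table rounds each AP to three decimals:

# Sanity check of the epoch-9 evaluation: mAP = mean of per-class AP.
per_class_ap = {
    'plane': 0.906, 'baseball-diamond': 0.897, 'bridge': 0.728,
    'ground-track-field': 0.888, 'small-vehicle': 0.769,
    'large-vehicle': 0.883, 'ship': 0.898, 'tennis-court': 0.908,
    'basketball-court': 0.893, 'storage-tank': 0.712,
    'soccer-ball-field': 0.853, 'roundabout': 0.810, 'harbor': 0.869,
    'swimming-pool': 0.751, 'helicopter': 0.877,
}

mean_ap = sum(per_class_ap.values()) / len(per_class_ap)
print(f'mAP over {len(per_class_ap)} DOTA classes: {mean_ap:.4f}')  # ~0.8428, matching the logged 0.8427 up to rounding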
2023-02-06 06:09:26,548 - mmrotate - INFO - Epoch [10][500/4271] lr: 2.000e-05, eta: 1:17:27, time: 0.379, data_time: 0.013, memory: 12592, loss_rpn_cls: 0.0074, loss_rpn_bbox: 0.0355, loss_cls: 0.0861, acc: 96.4242, loss_bbox: 0.0917, loss: 0.2207, grad_norm: 1.0229 2023-02-06 06:12:31,403 - mmrotate - INFO - Epoch [10][1000/4271] lr: 2.000e-05, eta: 1:14:17, time: 0.370, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0073, loss_rpn_bbox: 0.0341, loss_cls: 0.0853, acc: 96.4492, loss_bbox: 0.0904, loss: 0.2171, grad_norm: 1.0298 2023-02-06 06:15:36,754 - mmrotate - INFO - Epoch [10][1500/4271] lr: 2.000e-05, eta: 1:11:07, time: 0.371, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0074, loss_rpn_bbox: 0.0351, loss_cls: 0.0845, acc: 96.4896, loss_bbox: 0.0892, loss: 0.2161, grad_norm: 1.0038 2023-02-06 06:18:40,106 - mmrotate - INFO - Epoch [10][2000/4271] lr: 2.000e-05, eta: 1:07:57, time: 0.367, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0076, loss_rpn_bbox: 0.0343, loss_cls: 0.0843, acc: 96.5023, loss_bbox: 0.0896, loss: 0.2158, grad_norm: 1.0460 2023-02-06 06:23:52,350 - mmrotate - INFO - Epoch [10][2500/4271] lr: 2.000e-05, eta: 1:05:20, time: 0.624, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0075, loss_rpn_bbox: 0.0351, loss_cls: 0.0845, acc: 96.4978, loss_bbox: 0.0904, loss: 0.2175, grad_norm: 1.0227 2023-02-06 06:26:55,104 - mmrotate - INFO - Epoch [10][3000/4271] lr: 2.000e-05, eta: 1:02:08, time: 0.366, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0074, loss_rpn_bbox: 0.0346, loss_cls: 0.0848, acc: 96.4708, loss_bbox: 0.0902, loss: 0.2170, grad_norm: 1.0394 2023-02-06 06:29:58,942 - mmrotate - INFO - Epoch [10][3500/4271] lr: 2.000e-05, eta: 0:58:57, time: 0.368, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0074, loss_rpn_bbox: 0.0345, loss_cls: 0.0840, acc: 96.5096, loss_bbox: 0.0898, loss: 0.2157, grad_norm: 1.0284 2023-02-06 06:32:59,996 - mmrotate - INFO - Epoch [10][4000/4271] lr: 2.000e-05, eta: 0:55:45, time: 0.362, data_time: 0.007, memory: 12592, loss_rpn_cls: 0.0071, loss_rpn_bbox: 0.0348, loss_cls: 0.0833, acc: 96.5494, loss_bbox: 0.0891, loss: 0.2143, grad_norm: 1.0510 2023-02-06 06:34:40,210 - mmrotate - INFO - Saving checkpoint at 10 epochs 2023-02-06 07:18:16,889 - mmrotate - INFO - Epoch [11][500/4271] lr: 2.000e-05, eta: 0:55:38, time: 0.415, data_time: 0.014, memory: 10715, loss_rpn_cls: 0.0071, loss_rpn_bbox: 0.0343, loss_cls: 0.0817, acc: 96.6020, loss_bbox: 0.0888, loss: 0.2119, grad_norm: 1.0294 2023-02-06 07:21:20,856 - mmrotate - INFO - Epoch [11][1000/4271] lr: 2.000e-05, eta: 0:49:13, time: 0.368, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0070, loss_rpn_bbox: 0.0347, loss_cls: 0.0852, acc: 96.4620, loss_bbox: 0.0910, loss: 0.2179, grad_norm: 1.0433 2023-02-06 07:26:12,255 - mmrotate - INFO - Epoch [11][1500/4271] lr: 2.000e-05, eta: 0:53:26, time: 0.583, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0069, loss_rpn_bbox: 0.0336, loss_cls: 0.0815, acc: 96.6126, loss_bbox: 0.0883, loss: 0.2104, grad_norm: 1.0364 2023-02-06 07:30:29,723 - mmrotate - INFO - Epoch [11][2000/4271] lr: 2.000e-05, eta: 0:51:16, time: 0.515, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0072, loss_rpn_bbox: 0.0354, loss_cls: 0.0826, acc: 96.5745, loss_bbox: 0.0891, loss: 0.2144, grad_norm: 1.0473 2023-02-06 07:35:04,670 - mmrotate - INFO - Epoch [11][2500/4271] lr: 2.000e-05, eta: 0:48:57, time: 0.550, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0068, loss_rpn_bbox: 0.0343, loss_cls: 0.0829, acc: 96.5475, loss_bbox: 0.0893, loss: 0.2133, 
grad_norm: 1.0484
2023-02-06 07:38:52,009 - mmrotate - INFO - Epoch [11][3000/4271] lr: 2.000e-05, eta: 0:44:25, time: 0.455, data_time: 0.007, memory: 10715, loss_rpn_cls: 0.0072, loss_rpn_bbox: 0.0346, loss_cls: 0.0824, acc: 96.5797, loss_bbox: 0.0890, loss: 0.2132, grad_norm: 1.0415
2023-02-06 07:41:55,813 - mmrotate - INFO - Epoch [11][3500/4271] lr: 2.000e-05, eta: 0:39:03, time: 0.368, data_time: 0.007, memory: 10715, loss_rpn_cls: 0.0071, loss_rpn_bbox: 0.0328, loss_cls: 0.0826, acc: 96.5540, loss_bbox: 0.0891, loss: 0.2116, grad_norm: 1.0505
2023-02-06 07:45:01,179 - mmrotate - INFO - Epoch [11][4000/4271] lr: 2.000e-05, eta: 0:34:17, time: 0.371, data_time: 0.007, memory: 10715, loss_rpn_cls: 0.0068, loss_rpn_bbox: 0.0333, loss_cls: 0.0817, acc: 96.5987, loss_bbox: 0.0878, loss: 0.2096, grad_norm: 1.0598
2023-02-06 07:46:40,832 - mmrotate - INFO - Saving checkpoint at 11 epochs
2023-02-06 07:49:48,497 - mmrotate - INFO - Epoch [12][500/4271] lr: 2.000e-06, eta: 0:26:19, time: 0.373, data_time: 0.014, memory: 10715, loss_rpn_cls: 0.0067, loss_rpn_bbox: 0.0344, loss_cls: 0.0827, acc: 96.5626, loss_bbox: 0.0893, loss: 0.2130, grad_norm: 1.0475
2023-02-06 07:53:56,334 - mmrotate - INFO - Epoch [12][1000/4271] lr: 2.000e-06, eta: 0:23:13, time: 0.496, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0067, loss_rpn_bbox: 0.0339, loss_cls: 0.0810, acc: 96.6259, loss_bbox: 0.0877, loss: 0.2094, grad_norm: 1.0439
2023-02-06 07:58:14,925 - mmrotate - INFO - Epoch [12][1500/4271] lr: 2.000e-06, eta: 0:20:02, time: 0.517, data_time: 0.009, memory: 10715, loss_rpn_cls: 0.0068, loss_rpn_bbox: 0.0349, loss_cls: 0.0817, acc: 96.5913, loss_bbox: 0.0874, loss: 0.2108, grad_norm: 1.0193
2023-02-06 08:02:22,905 - mmrotate - INFO - Epoch [12][2000/4271] lr: 2.000e-06, eta: 0:16:36, time: 0.496, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0065, loss_rpn_bbox: 0.0321, loss_cls: 0.0795, acc: 96.6875, loss_bbox: 0.0870, loss: 0.2052, grad_norm: 1.0026
2023-02-06 08:07:30,660 - mmrotate - INFO - Epoch [12][2500/4271] lr: 2.000e-06, eta: 0:13:20, time: 0.615, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0066, loss_rpn_bbox: 0.0341, loss_cls: 0.0819, acc: 96.5813, loss_bbox: 0.0884, loss: 0.2110, grad_norm: 1.0301
2023-02-06 08:10:49,469 - mmrotate - INFO - Epoch [12][3000/4271] lr: 2.000e-06, eta: 0:09:29, time: 0.398, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0067, loss_rpn_bbox: 0.0338, loss_cls: 0.0823, acc: 96.5607, loss_bbox: 0.0891, loss: 0.2119, grad_norm: 1.0249
2023-02-06 08:13:51,880 - mmrotate - INFO - Epoch [12][3500/4271] lr: 2.000e-06, eta: 0:05:41, time: 0.365, data_time: 0.009, memory: 10715, loss_rpn_cls: 0.0068, loss_rpn_bbox: 0.0343, loss_cls: 0.0813, acc: 96.6195, loss_bbox: 0.0872, loss: 0.2096, grad_norm: 1.0323
2023-02-06 08:18:43,979 - mmrotate - INFO - Epoch [12][4000/4271] lr: 2.000e-06, eta: 0:02:02, time: 0.584, data_time: 0.008, memory: 10715, loss_rpn_cls: 0.0068, loss_rpn_bbox: 0.0328, loss_cls: 0.0814, acc: 96.6185, loss_bbox: 0.0876, loss: 0.2085, grad_norm: 1.0344
2023-02-06 08:20:24,389 - mmrotate - INFO - Saving checkpoint at 12 epochs
2023-02-06 08:22:50,449 - mmrotate - INFO -
+--------------------+-------+-------+--------+-------+
| class              | gts   | dets  | recall | ap    |
+--------------------+-------+-------+--------+-------+
| plane              | 23860 | 27782 | 0.957  | 0.906 |
| baseball-diamond   | 1727  | 2784  | 0.957  | 0.900 |
| bridge             | 4142  | 9597  | 0.837  | 0.748 |
| ground-track-field | 1134  | 3148  | 0.981  | 0.892 |
| small-vehicle      | 33183 | 53522 | 0.893  | 0.774 |
| large-vehicle      | 29737 | 51354 | 0.967  | 0.889 |
| ship               | 80574 | 89524 | 0.919  | 0.899 |
| tennis-court       | 4389  | 5703  | 0.986  | 0.908 |
| basketball-court   | 1097  | 2077  | 0.991  | 0.900 |
| storage-tank       | 28751 | 26871 | 0.714  | 0.712 |
| soccer-ball-field  | 1242  | 2913  | 0.923  | 0.868 |
| roundabout         | 1536  | 3054  | 0.907  | 0.871 |
| harbor             | 15489 | 22232 | 0.919  | 0.875 |
| swimming-pool      | 3456  | 5879  | 0.873  | 0.758 |
| helicopter         | 765   | 1493  | 0.920  | 0.883 |
+--------------------+-------+-------+--------+-------+
| mAP                |       |       |        | 0.852 |
+--------------------+-------+-------+--------+-------+
2023-02-06 08:22:50,548 - mmrotate - INFO - Exp name: lskb0_dota.py
2023-02-06 08:22:50,549 - mmrotate - INFO - Epoch(val) [12][2034] mAP: 0.8522
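The four Epoch(val) lines summarize how the run improves: mAP 0.7655 at epoch 3, 0.8071 at epoch 6, 0.8427 at epoch 9 and 0.8522 at epoch 12. A small standalone sketch for pulling that history out of a log file like this one; the log path is a placeholder, since the TextLoggerHook configured above writes a timestamped .log file into the work_dir:

# Sketch: extract the validation mAP history from an mmrotate training log.
import re

LOG_PATH = 'work_dirs/lskb0_dota/train.log'  # hypothetical filename, adjust to the actual *.log in the work_dir

# Matches lines such as "Epoch(val) [9][2034] mAP: 0.8427"
pattern = re.compile(r'Epoch\(val\)\s*\[(\d+)\]\[\d+\]\s*mAP:\s*([0-9.]+)')

with open(LOG_PATH) as f:
    history = [(int(ep), float(m)) for ep, m in pattern.findall(f.read())]

for epoch, map_value in history:
    print(f'epoch {epoch:2d}: mAP {map_value:.4f}')
# Expected for this run: epoch 3: 0.7655, epoch 6: 0.8071, epoch 9: 0.8427, epoch 12: 0.8522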