2023/05/27 14:03:10 - mmengine - INFO -
------------------------------------------------------------
System environment:
    sys.platform: linux
    Python: 3.9.0 (default, Nov 15 2020, 14:28:56) [GCC 7.3.0]
    CUDA available: True
    numpy_random_seed: 1356700692
    GPU 0,1,2,3,4,5,6,7: NVIDIA GeForce RTX 3090
    CUDA_HOME: /usr/local/cuda
    NVCC: Cuda compilation tools, release 11.3, V11.3.109
    GCC: gcc (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0
    PyTorch: 1.12.1+cu113
    PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.2
    - Built with CuDNN 8.3.2
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -fabi-version=11 -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.1, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF
    TorchVision: 0.13.1+cu102
    OpenCV: 4.7.0
    MMEngine: 0.7.3

Runtime environment:
    cudnn_benchmark: False
    mp_cfg: {'mp_start_method': 'fork', 'opencv_num_threads': 0}
    dist_cfg: {'backend': 'nccl'}
    seed: None
    diff_rank_seed: False
    deterministic: False
    Distributed launcher: pytorch
    Distributed training: True
    GPU number: 32
------------------------------------------------------------
2023/05/27 14:03:10 - mmengine - INFO - Config:
model = dict(
    type='Recognizer2D',
    backbone=dict(type='timm.swin_base_patch4_window7_224', pretrained=True),
    cls_head=dict(
        type='TSNHead',
        num_classes=400,
        in_channels=1024,
        spatial_type='avg',
        consensus=dict(type='AvgConsensus', dim=1),
        dropout_ratio=0.4,
        init_std=0.01,
        average_clips='prob'),
    data_preprocessor=dict(
        type='ActionDataPreprocessor',
        mean=[123.675, 116.28, 103.53],
        std=[58.395, 57.12, 57.375],
        format_shape='NCHW'),
    train_cfg=None,
    test_cfg=None)
train_cfg = dict(
    type='EpochBasedTrainLoop', max_epochs=50, val_begin=1, val_interval=1)
val_cfg = dict(type='ValLoop')
test_cfg = dict(type='TestLoop')
param_scheduler = [
    dict(
        type='MultiStepLR',
        begin=0,
        end=50,
        by_epoch=True,
        milestones=[20, 40],
        gamma=0.1)
]
optim_wrapper = dict(
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9, weight_decay=0.0001),
    clip_grad=dict(max_norm=40, norm_type=2))
default_scope = 'mmaction'
default_hooks = dict(
    runtime_info=dict(type='RuntimeInfoHook'),
    timer=dict(type='IterTimerHook'),
    logger=dict(type='LoggerHook', interval=20, ignore_last=False),
    param_scheduler=dict(type='ParamSchedulerHook'),
    checkpoint=dict(
        type='CheckpointHook', interval=3, save_best='auto', max_keep_ckpts=3),
    sampler_seed=dict(type='DistSamplerSeedHook'),
    sync_buffers=dict(type='SyncBuffersHook'))
env_cfg = dict(
    cudnn_benchmark=False,
    mp_cfg=dict(mp_start_method='fork', opencv_num_threads=0),
    dist_cfg=dict(backend='nccl'))
log_processor = dict(type='LogProcessor', window_size=20, by_epoch=True)
vis_backends = [dict(type='LocalVisBackend')]
visualizer = dict(
    type='ActionVisualizer', vis_backends=[dict(type='LocalVisBackend')])
log_level = 'INFO'
load_from = None
resume = True
dataset_type = 'VideoDataset'
data_root = 'data/kinetics400/videos_train'
data_root_val = 'data/kinetics400/videos_val'
ann_file_train = 'data/kinetics400/kinetics400_train_list_videos.txt'
ann_file_val = 'data/kinetics400/kinetics400_val_list_videos.txt'
file_client_args = dict(io_backend='disk')
train_pipeline = [
    dict(type='DecordInit', io_backend='disk'),
    dict(type='SampleFrames', clip_len=1, frame_interval=1, num_clips=8),
    dict(type='DecordDecode'),
    dict(type='Resize', scale=(-1, 256)),
    dict(
        type='MultiScaleCrop',
        input_size=224,
        scales=(1, 0.875, 0.75, 0.66),
        random_crop=False,
        max_wh_scale_gap=1),
    dict(type='Resize', scale=(224, 224), keep_ratio=False),
    dict(type='Flip', flip_ratio=0.5),
    dict(type='FormatShape', input_format='NCHW'),
    dict(type='PackActionInputs')
]
val_pipeline = [
    dict(type='DecordInit', io_backend='disk'),
    dict(
        type='SampleFrames',
        clip_len=1,
        frame_interval=1,
        num_clips=8,
        test_mode=True),
    dict(type='DecordDecode'),
    dict(type='Resize', scale=(-1, 256)),
    dict(type='CenterCrop', crop_size=224),
    dict(type='FormatShape', input_format='NCHW'),
    dict(type='PackActionInputs')
]
test_pipeline = [
    dict(type='DecordInit', io_backend='disk'),
    dict(
        type='SampleFrames',
        clip_len=1,
        frame_interval=1,
        num_clips=25,
        test_mode=True),
    dict(type='DecordDecode'),
    dict(type='Resize', scale=(-1, 256)),
    dict(type='TenCrop', crop_size=224),
    dict(type='FormatShape', input_format='NCHW'),
    dict(type='PackActionInputs')
]
train_dataloader = dict(
    batch_size=8,
    num_workers=8,
    persistent_workers=True,
    sampler=dict(type='DefaultSampler', shuffle=True),
    dataset=dict(
        type='VideoDataset',
        ann_file='data/kinetics400/kinetics400_train_list_videos.txt',
        data_prefix=dict(video='data/kinetics400/videos_train'),
        pipeline=[
            dict(type='DecordInit', io_backend='disk'),
            dict(
                type='SampleFrames',
                clip_len=1,
                frame_interval=1,
                num_clips=8),
            dict(type='DecordDecode'),
            dict(type='Resize', scale=(-1, 256)),
            dict(
                type='MultiScaleCrop',
                input_size=224,
                scales=(1, 0.875, 0.75, 0.66),
                random_crop=False,
                max_wh_scale_gap=1),
            dict(type='Resize', scale=(224, 224), keep_ratio=False),
            dict(type='Flip', flip_ratio=0.5),
            dict(type='FormatShape', input_format='NCHW'),
            dict(type='PackActionInputs')
        ]))
val_dataloader = dict(
    batch_size=8,
    num_workers=8,
    persistent_workers=True,
    sampler=dict(type='DefaultSampler', shuffle=False),
    dataset=dict(
        type='VideoDataset',
        ann_file='data/kinetics400/kinetics400_val_list_videos.txt',
        data_prefix=dict(video='data/kinetics400/videos_val'),
        pipeline=[
            dict(type='DecordInit', io_backend='disk'),
            dict(
                type='SampleFrames',
                clip_len=1,
                frame_interval=1,
                num_clips=8,
                test_mode=True),
            dict(type='DecordDecode'),
            dict(type='Resize', scale=(-1, 256)),
            dict(type='CenterCrop', crop_size=224),
            dict(type='FormatShape', input_format='NCHW'),
            dict(type='PackActionInputs')
        ],
        test_mode=True))
test_dataloader = dict(
    batch_size=1,
    num_workers=8,
    persistent_workers=True,
    sampler=dict(type='DefaultSampler', shuffle=False),
    dataset=dict(
        type='VideoDataset',
        ann_file='data/kinetics400/kinetics400_val_list_videos.txt',
        data_prefix=dict(video='data/kinetics400/videos_val'),
        pipeline=[
            dict(type='DecordInit', io_backend='disk'),
            dict(
                type='SampleFrames',
                clip_len=1,
                frame_interval=1,
                num_clips=25,
                test_mode=True),
            dict(type='DecordDecode'),
            dict(type='Resize', scale=(-1, 256)),
            dict(type='TenCrop', crop_size=224),
            dict(type='FormatShape', input_format='NCHW'),
            dict(type='PackActionInputs')
        ],
        test_mode=True))
val_evaluator = dict(type='AccMetric')
test_evaluator = dict(type='AccMetric')
auto_scale_lr = dict(enable=True, base_batch_size=256)
launcher = 'pytorch'
work_dir = './work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb'
randomness = dict(seed=None, diff_rank_seed=False, deterministic=False)

2023/05/27 14:03:14 - mmengine - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH   ) RuntimeInfoHook
(BELOW_NORMAL) LoggerHook
 --------------------
before_train:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(VERY_LOW    ) CheckpointHook
 --------------------
before_train_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(NORMAL      ) DistSamplerSeedHook
 --------------------
before_train_iter:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
 --------------------
after_train_iter:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
after_train_epoch:
(NORMAL      ) IterTimerHook
(NORMAL      ) SyncBuffersHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
before_val_epoch:
(NORMAL      ) IterTimerHook
(NORMAL      ) SyncBuffersHook
 --------------------
before_val_iter:
(NORMAL      ) IterTimerHook
 --------------------
after_val_iter:
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_val_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
after_train:
(VERY_LOW    ) CheckpointHook
 --------------------
before_test_epoch:
(NORMAL      ) IterTimerHook
 --------------------
before_test_iter:
(NORMAL      ) IterTimerHook
 --------------------
after_test_iter:
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_test_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_run:
(BELOW_NORMAL) LoggerHook
 --------------------
2023/05/27 14:03:16 - mmengine - INFO - LR is set based on batch size of 256 and the current batch size is 256. Scaling the original LR by 1.0.

Name of parameter - Initialization information

backbone.patch_embed.proj.weight - torch.Size([128, 3, 4, 4]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.patch_embed.proj.bias - torch.Size([128]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.patch_embed.norm.weight - torch.Size([128]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.patch_embed.norm.bias - torch.Size([128]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.layers.0.blocks.0.norm1.weight - torch.Size([128]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.layers.0.blocks.0.norm1.bias - torch.Size([128]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.layers.0.blocks.0.attn.relative_position_bias_table - torch.Size([169, 4]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.layers.0.blocks.0.attn.qkv.weight - torch.Size([384, 128]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.layers.0.blocks.0.attn.qkv.bias - torch.Size([384]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.layers.0.blocks.0.attn.proj.weight - torch.Size([128, 128]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.layers.0.blocks.0.attn.proj.bias - torch.Size([128]):
The value is the same before and after calling `init_weights` of Recognizer2D

backbone.layers.0.blocks.0.norm2.weight - torch.Size([128]):
The value is the same before and after calling `init_weights` of Recognizer2D
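The "Scaling the original LR by 1.0" message above follows from the linear scaling rule behind `auto_scale_lr = dict(enable=True, base_batch_size=256)`: the effective batch size here is 32 GPUs x 8 samples per GPU = 256, which matches `base_batch_size`, so the configured SGD `lr=0.01` is left untouched. A minimal sketch of that arithmetic (the function name `scale_lr` is illustrative, not the actual mmengine implementation):

```python
def scale_lr(base_lr: float, num_gpus: int, batch_per_gpu: int,
             base_batch_size: int) -> float:
    """Linearly rescale the LR by (effective batch size / base batch size)."""
    effective_batch = num_gpus * batch_per_gpu  # this run: 32 * 8 = 256
    return base_lr * effective_batch / base_batch_size

# 256 / 256 = 1.0, so the LR stays at the configured 0.01.
print(scale_lr(0.01, num_gpus=32, batch_per_gpu=8, base_batch_size=256))
```

Had the run used, say, 16 GPUs at the same per-GPU batch size, the same rule would halve the LR to 0.005.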
backbone.layers.0.blocks.0.norm2.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.0.mlp.fc1.weight - torch.Size([512, 128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.0.mlp.fc1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.0.mlp.fc2.weight - torch.Size([128, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.0.mlp.fc2.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.norm1.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.norm1.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.attn.relative_position_bias_table - torch.Size([169, 4]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.attn.qkv.weight - torch.Size([384, 128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.attn.qkv.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.attn.proj.weight - torch.Size([128, 128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.attn.proj.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.norm2.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.norm2.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of Recognizer2D 
backbone.layers.0.blocks.1.mlp.fc1.weight - torch.Size([512, 128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.mlp.fc1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.mlp.fc2.weight - torch.Size([128, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.blocks.1.mlp.fc2.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.downsample.norm.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.downsample.norm.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.0.downsample.reduction.weight - torch.Size([256, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.norm1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.norm1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.attn.relative_position_bias_table - torch.Size([169, 8]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.attn.qkv.weight - torch.Size([768, 256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.attn.qkv.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.attn.proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.attn.proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of 
Recognizer2D backbone.layers.1.blocks.0.norm2.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.norm2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.mlp.fc1.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.mlp.fc1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.mlp.fc2.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.0.mlp.fc2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.norm1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.norm1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.attn.relative_position_bias_table - torch.Size([169, 8]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.attn.qkv.weight - torch.Size([768, 256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.attn.qkv.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.attn.proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.attn.proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.norm2.weight - torch.Size([256]): The value is the same before and after calling `init_weights` 
of Recognizer2D backbone.layers.1.blocks.1.norm2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.mlp.fc1.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.mlp.fc1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.mlp.fc2.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.blocks.1.mlp.fc2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.downsample.norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.downsample.norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.1.downsample.reduction.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after 
calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.0.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.attn.proj.bias - torch.Size([512]): The value is the same before 
and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.1.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.norm2.weight - torch.Size([512]): The value is the same 
before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.2.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.norm2.bias - torch.Size([512]): The value is the 
same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.3.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.mlp.fc1.weight - torch.Size([2048, 512]): 
The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.4.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D backbone.layers.2.blocks.5.mlp.fc1.bias - 
torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.5.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.5.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.6.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.7.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.8.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.9.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.10.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.11.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.12.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.13.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.14.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.15.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.16.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.attn.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.attn.qkv.weight - torch.Size([1536, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.attn.qkv.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.attn.proj.weight - torch.Size([512, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.attn.proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.mlp.fc1.weight - torch.Size([2048, 512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.mlp.fc1.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.mlp.fc2.weight - torch.Size([512, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.blocks.17.mlp.fc2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.downsample.norm.weight - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.downsample.norm.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.2.downsample.reduction.weight - torch.Size([1024, 2048]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.attn.relative_position_bias_table - torch.Size([169, 32]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.0.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.attn.relative_position_bias_table - torch.Size([169, 32]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.layers.3.blocks.1.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
backbone.norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of Recognizer2D
cls_head.fc_cls.weight - torch.Size([400, 1024]): Initialized by user-defined `init_weights` in TSNHead
cls_head.fc_cls.bias - torch.Size([400]): Initialized by user-defined `init_weights` in TSNHead
2023/05/27 14:03:17 - mmengine - INFO - Auto resumed from the latest checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/epoch_21.pth.
2023/05/27 14:03:17 - mmengine - INFO - Load checkpoint from /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/epoch_21.pth
2023/05/27 14:03:17 - mmengine - INFO - resumed epoch: 21, iter: 19740
2023/05/27 14:03:17 - mmengine - WARNING - "FileClient" will be deprecated in future. Please use io functions in https://mmengine.readthedocs.io/en/latest/api/fileio.html#file-io
2023/05/27 14:03:17 - mmengine - WARNING - "HardDiskBackend" is the alias of "LocalBackend" and the former will be deprecated in future.
2023/05/27 14:03:17 - mmengine - INFO - Checkpoints will be saved to /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb.
2023/05/27 14:03:37 - mmengine - INFO - Epoch(train) [22][ 20/940] lr: 1.0000e-03 eta: 7:32:22 time: 0.9964 data_time: 0.1189 memory: 15293 grad_norm: 3.9156 loss: 0.8659 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.8659
2023/05/27 14:03:49 - mmengine - INFO - Epoch(train) [22][ 40/940] lr: 1.0000e-03 eta: 5:58:33 time: 0.5843 data_time: 0.0070 memory: 15293 grad_norm: 4.3739 loss: 0.7195 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7195
2023/05/27 14:04:01 - mmengine - INFO - Epoch(train) [22][ 60/940] lr: 1.0000e-03 eta: 5:27:30 time: 0.5866 data_time: 0.0072 memory: 15293 grad_norm: 3.9822 loss: 0.8592 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8592
2023/05/27 14:04:12 - mmengine - INFO - Epoch(train) [22][ 80/940] lr: 1.0000e-03 eta: 5:11:14 time: 0.5809 data_time: 0.0071 memory: 15293 grad_norm: 4.0947 loss: 0.6799 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6799
2023/05/27 14:04:24 - mmengine - INFO - Epoch(train) [22][100/940] lr: 1.0000e-03 eta: 5:01:52 time: 0.5860 data_time: 0.0072 memory: 15293 grad_norm: 4.1846 loss: 0.9910 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.9910
2023/05/27 14:04:36 - mmengine - INFO - Epoch(train) [22][120/940] lr: 1.0000e-03 eta: 4:55:58 time: 0.5916 data_time: 0.0069 memory: 15293 grad_norm: 4.1688 loss: 0.8423 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8423
2023/05/27 14:04:48 - mmengine - INFO - Epoch(train) [22][140/940] lr: 1.0000e-03 eta: 4:51:30 time: 0.5886 data_time: 0.0078 memory: 15293 grad_norm: 4.3411 loss: 0.9464 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.9464
2023/05/27 14:05:00 - mmengine - INFO - Epoch(train) [22][160/940] lr: 1.0000e-03 eta: 4:48:04 time: 0.5879 data_time: 0.0078 memory: 15293 grad_norm: 4.0842 loss: 0.7459 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7459
2023/05/27 14:05:11 - mmengine - INFO - Epoch(train) [22][180/940] lr: 1.0000e-03 eta: 4:45:40 time: 0.5942 data_time: 0.0072 memory: 15293 grad_norm: 4.8443 loss: 0.7351 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7351
2023/05/27 14:05:23 - mmengine - INFO - Epoch(train) [22][200/940] lr: 1.0000e-03 eta: 4:43:19 time: 0.5857 data_time: 0.0074 memory: 15293 grad_norm: 4.0792 loss: 0.8795 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.8795
2023/05/27 14:05:35 - mmengine - INFO - Epoch(train) [22][220/940] lr: 1.0000e-03 eta: 4:41:20 time: 0.5849 data_time: 0.0072 memory: 15293 grad_norm: 4.0099 loss: 0.6296 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6296
2023/05/27 14:05:47 - mmengine - INFO - Epoch(train) [22][240/940] lr: 1.0000e-03 eta: 4:39:40 time: 0.5854 data_time: 0.0070 memory: 15293 grad_norm: 6.1143 loss: 0.7018 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7018
2023/05/27 14:05:58 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 14:05:58 - mmengine - INFO - Epoch(train) [22][260/940] lr: 1.0000e-03 eta: 4:38:32 time: 0.5941 data_time: 0.0069 memory: 15293 grad_norm: 4.2407 loss: 0.9458 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.9458
2023/05/27 14:06:10 - mmengine - INFO - Epoch(train) [22][280/940] lr: 1.0000e-03 eta: 4:37:15 time: 0.5855 data_time: 0.0071 memory: 15293 grad_norm: 5.2977 loss: 0.9981 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.9981
2023/05/27 14:06:22 - mmengine - INFO - Epoch(train) [22][300/940] lr: 1.0000e-03 eta: 4:36:08 time: 0.5861 data_time: 0.0075 memory: 15293 grad_norm: 4.1108 loss: 0.6858 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6858
2023/05/27 14:06:34 - mmengine - INFO - Epoch(train) [22][320/940] lr: 1.0000e-03 eta: 4:35:15 time: 0.5907 data_time: 0.0071 memory: 15293 grad_norm: 4.2238 loss: 0.8794 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8794
2023/05/27 14:06:46 - mmengine - INFO - Epoch(train) [22][340/940] lr: 1.0000e-03 eta: 4:34:37 time: 0.5966 data_time: 0.0078 memory: 15293 grad_norm: 4.7393 loss: 0.6965 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6965
2023/05/27 14:06:57 - mmengine - INFO - Epoch(train) [22][360/940] lr: 1.0000e-03 eta: 4:33:52 time: 0.5902 data_time: 0.0073 memory: 15293 grad_norm: 4.4083 loss: 0.7346 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7346
2023/05/27 14:07:09 - mmengine - INFO - Epoch(train) [22][380/940] lr: 1.0000e-03 eta: 4:33:06 time: 0.5868 data_time: 0.0075 memory: 15293 grad_norm: 4.4153 loss: 0.9033 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.9033
2023/05/27 14:07:21 - mmengine - INFO - Epoch(train) [22][400/940] lr: 1.0000e-03 eta: 4:32:26 time: 0.5892 data_time: 0.0070 memory: 15293 grad_norm: 4.2976 loss: 0.8014 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8014
2023/05/27 14:07:33 - mmengine - INFO - Epoch(train) [22][420/940] lr: 1.0000e-03 eta: 4:31:47 time: 0.5877 data_time: 0.0071 memory: 15293 grad_norm: 5.2022 loss: 0.6721 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.6721
2023/05/27 14:07:45 - mmengine - INFO - Epoch(train) [22][440/940] lr: 1.0000e-03 eta: 4:31:14 time: 0.5904 data_time: 0.0070 memory: 15293 grad_norm: 4.2307 loss: 0.5401 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5401
2023/05/27 14:07:56 - mmengine - INFO - Epoch(train) [22][460/940] lr: 1.0000e-03 eta: 4:30:40 time: 0.5883 data_time: 0.0075 memory: 15293 grad_norm: 4.1984 loss: 0.7094 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7094
2023/05/27 14:08:08 - mmengine - INFO - Epoch(train) [22][480/940] lr: 1.0000e-03 eta: 4:30:05 time: 0.5851 data_time: 0.0072 memory: 15293 grad_norm: 4.1192 loss: 0.7614 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7614
2023/05/27 14:08:20 - mmengine - INFO - Epoch(train) [22][500/940] lr: 1.0000e-03 eta: 4:29:42 time: 0.5952 data_time: 0.0072 memory: 15293 grad_norm: 4.1680 loss: 0.6972 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6972
2023/05/27 14:08:32 - mmengine - INFO - Epoch(train) [22][520/940] lr: 1.0000e-03 eta: 4:29:18 time: 0.5928 data_time: 0.0071 memory: 15293 grad_norm: 4.0843 loss: 0.7826 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7826
2023/05/27 14:08:44 - mmengine - INFO - Epoch(train) [22][540/940] lr: 1.0000e-03 eta: 4:28:52 time: 0.5904 data_time: 0.0070 memory: 15293 grad_norm: 4.1511 loss: 0.7577 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7577
2023/05/27 14:08:55 - mmengine - INFO - Epoch(train) [22][560/940] lr: 1.0000e-03 eta: 4:28:28 time: 0.5908 data_time: 0.0072 memory: 15293 grad_norm: 4.2491 loss: 0.8032 top1_acc: 0.6250 top5_acc: 0.6250 loss_cls: 0.8032
2023/05/27 14:09:07 - mmengine - INFO - Epoch(train) [22][580/940] lr: 1.0000e-03 eta: 4:27:59 time: 0.5856 data_time: 0.0074 memory: 15293 grad_norm: 4.3389 loss: 0.6422 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6422
2023/05/27 14:09:19 - mmengine - INFO - Epoch(train) [22][600/940] lr: 1.0000e-03 eta: 4:27:33 time: 0.5866 data_time: 0.0072 memory: 15293 grad_norm: 6.4753 loss: 0.7601 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.7601
2023/05/27 14:09:31 - mmengine - INFO - Epoch(train) [22][620/940] lr: 1.0000e-03 eta: 4:27:07 time: 0.5856 data_time: 0.0074 memory: 15293 grad_norm: 4.6514 loss: 0.7834 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7834
2023/05/27 14:09:42 - mmengine - INFO - Epoch(train) [22][640/940] lr: 1.0000e-03 eta: 4:26:49 time: 0.5949 data_time: 0.0072 memory: 15293 grad_norm: 4.6183 loss: 0.6440 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6440
2023/05/27 14:09:54 - mmengine - INFO - Epoch(train) [22][660/940] lr: 1.0000e-03 eta: 4:26:24 time: 0.5850 data_time: 0.0074 memory: 15293 grad_norm: 4.3853 loss: 0.9169 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.9169
2023/05/27 14:10:06 - mmengine - INFO - Epoch(train) [22][680/940] lr: 1.0000e-03 eta: 4:26:00 time: 0.5862 data_time: 0.0080 memory: 15293 grad_norm: 4.4458 loss: 0.7673 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.7673
2023/05/27 14:10:18 - mmengine - INFO - Epoch(train) [22][700/940] lr: 1.0000e-03 eta: 4:25:40 time: 0.5899 data_time: 0.0072 memory: 15293 grad_norm: 4.1167 loss: 0.6325 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6325
2023/05/27 14:10:29 - mmengine - INFO - Epoch(train) [22][720/940] lr: 1.0000e-03 eta: 4:25:18 time: 0.5858 data_time: 0.0072 memory: 15293 grad_norm: 4.1606 loss: 0.7471 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7471
2023/05/27 14:10:41 - mmengine - INFO - Epoch(train) [22][740/940] lr: 1.0000e-03 eta: 4:24:58 time: 0.5888 data_time: 0.0074 memory: 15293 grad_norm: 4.2152 loss: 0.9193 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.9193
2023/05/27 14:10:53 - mmengine - INFO - Epoch(train) [22][760/940] lr: 1.0000e-03 eta: 4:24:38 time: 0.5885 data_time: 0.0072 memory: 15293 grad_norm: 5.4508 loss: 0.8029 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.8029
2023/05/27 14:11:05 - mmengine - INFO - Epoch(train) [22][780/940] lr: 1.0000e-03 eta: 4:24:17 time: 0.5863 data_time: 0.0072 memory: 15293 grad_norm: 4.3156 loss: 0.7260 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7260
2023/05/27 14:11:16 - mmengine - INFO - Epoch(train) [22][800/940] lr: 1.0000e-03 eta: 4:24:02 time: 0.5933 data_time: 0.0071 memory: 15293 grad_norm: 4.0563 loss: 0.5960 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5960
2023/05/27 14:11:28 - mmengine - INFO - Epoch(train) [22][820/940] lr: 1.0000e-03 eta: 4:23:41 time: 0.5858 data_time: 0.0073 memory: 15293 grad_norm: 5.1779 loss: 0.6107 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6107
2023/05/27 14:11:40 - mmengine - INFO - Epoch(train) [22][840/940] lr: 1.0000e-03 eta: 4:23:27 time: 0.5945 data_time: 0.0072 memory: 15293 grad_norm: 4.1734 loss: 0.8278 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8278
2023/05/27 14:11:52 - mmengine - INFO - Epoch(train) [22][860/940] lr: 1.0000e-03 eta: 4:23:08 time: 0.5876 data_time: 0.0073 memory: 15293 grad_norm: 4.2060 loss: 0.7401 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7401
2023/05/27 14:12:04 - mmengine - INFO - Epoch(train) [22][880/940] lr: 1.0000e-03 eta: 4:22:50 time: 0.5868 data_time: 0.0076 memory: 15293 grad_norm: 4.9409 loss: 0.9544 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.9544
2023/05/27 14:12:15 - mmengine - INFO - Epoch(train) [22][900/940] lr: 1.0000e-03 eta: 4:22:34 time: 0.5918 data_time: 0.0077 memory: 15293 grad_norm: 4.3099 loss: 0.6425 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6425
2023/05/27 14:12:27 - mmengine - INFO - Epoch(train) [22][920/940] lr: 1.0000e-03 eta: 4:22:15 time: 0.5855 data_time: 0.0075 memory: 15293 grad_norm: 4.3082 loss: 0.7594 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7594
2023/05/27 14:12:39 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 14:12:39 - mmengine - INFO - Epoch(train) [22][940/940] lr: 1.0000e-03 eta: 4:21:48 time: 0.5706 data_time: 0.0073 memory: 15293 grad_norm: 4.4435 loss: 0.5516 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5516
2023/05/27 14:12:46 - mmengine - INFO - Epoch(val) [22][20/78] eta: 0:00:20 time: 0.3598 data_time: 0.1751 memory: 2851
2023/05/27 14:12:50 - mmengine - INFO - Epoch(val) [22][40/78] eta: 0:00:11 time: 0.2237 data_time: 0.0391 memory: 2851
2023/05/27 14:12:55 - mmengine - INFO - Epoch(val) [22][60/78] eta: 0:00:04 time: 0.2185 data_time: 0.0341 memory: 2851
2023/05/27 14:13:02 - mmengine - INFO - Epoch(val) [22][78/78] acc/top1: 0.7671 acc/top5: 0.9278 acc/mean1: 0.7670 data_time: 0.0639 time: 0.2454
2023/05/27 14:13:04 - mmengine - INFO - The best checkpoint with 0.7671 acc/top1 at 22 epoch is saved to best_acc_top1_epoch_22.pth.
2023/05/27 14:13:17 - mmengine - INFO - Epoch(train) [23][ 20/940] lr: 1.0000e-03 eta: 4:22:16 time: 0.6699 data_time: 0.0888 memory: 15293 grad_norm: 4.1160 loss: 0.7138 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7138
2023/05/27 14:13:29 - mmengine - INFO - Epoch(train) [23][ 40/940] lr: 1.0000e-03 eta: 4:22:00 time: 0.5892 data_time: 0.0073 memory: 15293 grad_norm: 4.0883 loss: 0.8942 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8942
2023/05/27 14:13:40 - mmengine - INFO - Epoch(train) [23][ 60/940] lr: 1.0000e-03 eta: 4:21:42 time: 0.5867 data_time: 0.0072 memory: 15293 grad_norm: 4.2432 loss: 0.5801 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5801
2023/05/27 14:13:52 - mmengine - INFO - Epoch(train) [23][ 80/940] lr: 1.0000e-03 eta: 4:21:29 time: 0.5960 data_time: 0.0071 memory: 15293 grad_norm: 4.1888 loss: 0.6047 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6047
2023/05/27 14:14:04 - mmengine - INFO - Epoch(train) [23][100/940] lr: 1.0000e-03 eta: 4:21:14 time: 0.5936 data_time: 0.0073 memory: 15293 grad_norm: 4.0738 loss: 0.5669 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5669
2023/05/27 14:14:16 - mmengine - INFO - Epoch(train) [23][120/940] lr: 1.0000e-03 eta: 4:21:02 time: 0.5968 data_time: 0.0072 memory: 15293 grad_norm: 4.1228 loss: 0.7866 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7866
2023/05/27 14:14:28 - mmengine - INFO - Epoch(train) [23][140/940] lr: 1.0000e-03 eta: 4:20:44 time: 0.5854 data_time: 0.0073 memory: 15293 grad_norm: 4.2053 loss: 0.6400 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6400
2023/05/27 14:14:40 - mmengine - INFO - Epoch(train) [23][160/940] lr: 1.0000e-03 eta: 4:20:29 time: 0.5908 data_time: 0.0072 memory: 15293 grad_norm: 4.2568 loss: 0.9460 top1_acc: 0.3750 top5_acc: 0.6250 loss_cls: 0.9460
2023/05/27 14:14:52 - mmengine - INFO - Epoch(train) [23][180/940] lr:
1.0000e-03 eta: 4:20:16 time: 0.5960 data_time: 0.0074 memory: 15293 grad_norm: 4.4709 loss: 0.6903 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6903 2023/05/27 14:15:04 - mmengine - INFO - Epoch(train) [23][200/940] lr: 1.0000e-03 eta: 4:20:03 time: 0.5945 data_time: 0.0074 memory: 15293 grad_norm: 4.7294 loss: 0.5895 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5895 2023/05/27 14:15:15 - mmengine - INFO - Epoch(train) [23][220/940] lr: 1.0000e-03 eta: 4:19:46 time: 0.5865 data_time: 0.0076 memory: 15293 grad_norm: 4.9056 loss: 0.8910 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.8910 2023/05/27 14:15:27 - mmengine - INFO - Epoch(train) [23][240/940] lr: 1.0000e-03 eta: 4:19:29 time: 0.5858 data_time: 0.0074 memory: 15293 grad_norm: 4.0412 loss: 0.5057 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.5057 2023/05/27 14:15:39 - mmengine - INFO - Epoch(train) [23][260/940] lr: 1.0000e-03 eta: 4:19:16 time: 0.5951 data_time: 0.0073 memory: 15293 grad_norm: 4.4835 loss: 0.5620 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.5620 2023/05/27 14:15:51 - mmengine - INFO - Epoch(train) [23][280/940] lr: 1.0000e-03 eta: 4:19:01 time: 0.5887 data_time: 0.0075 memory: 15293 grad_norm: 4.1028 loss: 0.8587 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.8587 2023/05/27 14:16:02 - mmengine - INFO - Epoch(train) [23][300/940] lr: 1.0000e-03 eta: 4:18:45 time: 0.5876 data_time: 0.0074 memory: 15293 grad_norm: 4.1052 loss: 0.5271 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5271 2023/05/27 14:16:14 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 14:16:14 - mmengine - INFO - Epoch(train) [23][320/940] lr: 1.0000e-03 eta: 4:18:30 time: 0.5883 data_time: 0.0074 memory: 15293 grad_norm: 3.9038 loss: 0.7160 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7160 2023/05/27 14:16:26 - mmengine - INFO - Epoch(train) [23][340/940] lr: 1.0000e-03 eta: 4:18:13 time: 0.5853 data_time: 0.0078 memory: 15293 grad_norm: 
5.2131 loss: 0.5929 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5929 2023/05/27 14:16:38 - mmengine - INFO - Epoch(train) [23][360/940] lr: 1.0000e-03 eta: 4:17:58 time: 0.5872 data_time: 0.0075 memory: 15293 grad_norm: 4.2264 loss: 0.7125 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.7125 2023/05/27 14:16:49 - mmengine - INFO - Epoch(train) [23][380/940] lr: 1.0000e-03 eta: 4:17:42 time: 0.5878 data_time: 0.0088 memory: 15293 grad_norm: 4.6575 loss: 0.9861 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.9861 2023/05/27 14:17:01 - mmengine - INFO - Epoch(train) [23][400/940] lr: 1.0000e-03 eta: 4:17:27 time: 0.5878 data_time: 0.0077 memory: 15293 grad_norm: 4.0935 loss: 0.6764 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6764 2023/05/27 14:17:13 - mmengine - INFO - Epoch(train) [23][420/940] lr: 1.0000e-03 eta: 4:17:12 time: 0.5872 data_time: 0.0077 memory: 15293 grad_norm: 4.2226 loss: 0.5320 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5320 2023/05/27 14:17:25 - mmengine - INFO - Epoch(train) [23][440/940] lr: 1.0000e-03 eta: 4:16:59 time: 0.5929 data_time: 0.0074 memory: 15293 grad_norm: 4.1080 loss: 0.6973 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6973 2023/05/27 14:17:37 - mmengine - INFO - Epoch(train) [23][460/940] lr: 1.0000e-03 eta: 4:16:45 time: 0.5907 data_time: 0.0078 memory: 15293 grad_norm: 5.4361 loss: 0.7217 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7217 2023/05/27 14:17:48 - mmengine - INFO - Epoch(train) [23][480/940] lr: 1.0000e-03 eta: 4:16:31 time: 0.5890 data_time: 0.0074 memory: 15293 grad_norm: 4.8668 loss: 0.7105 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7105 2023/05/27 14:18:00 - mmengine - INFO - Epoch(train) [23][500/940] lr: 1.0000e-03 eta: 4:16:16 time: 0.5866 data_time: 0.0076 memory: 15293 grad_norm: 4.6744 loss: 0.8735 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.8735 2023/05/27 14:18:12 - mmengine - INFO - Epoch(train) [23][520/940] lr: 1.0000e-03 eta: 4:16:00 time: 0.5858 data_time: 0.0075 memory: 15293 grad_norm: 
4.7664 loss: 0.7530 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7530 2023/05/27 14:18:24 - mmengine - INFO - Epoch(train) [23][540/940] lr: 1.0000e-03 eta: 4:15:48 time: 0.5953 data_time: 0.0077 memory: 15293 grad_norm: 4.2394 loss: 0.7181 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7181 2023/05/27 14:18:35 - mmengine - INFO - Epoch(train) [23][560/940] lr: 1.0000e-03 eta: 4:15:34 time: 0.5892 data_time: 0.0074 memory: 15293 grad_norm: 4.0848 loss: 0.6415 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6415 2023/05/27 14:18:47 - mmengine - INFO - Epoch(train) [23][580/940] lr: 1.0000e-03 eta: 4:15:21 time: 0.5899 data_time: 0.0075 memory: 15293 grad_norm: 4.1552 loss: 0.6237 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6237 2023/05/27 14:18:59 - mmengine - INFO - Epoch(train) [23][600/940] lr: 1.0000e-03 eta: 4:15:06 time: 0.5886 data_time: 0.0075 memory: 15293 grad_norm: 4.9584 loss: 0.7365 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7365 2023/05/27 14:19:11 - mmengine - INFO - Epoch(train) [23][620/940] lr: 1.0000e-03 eta: 4:14:53 time: 0.5896 data_time: 0.0076 memory: 15293 grad_norm: 4.2819 loss: 0.7883 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.7883 2023/05/27 14:19:23 - mmengine - INFO - Epoch(train) [23][640/940] lr: 1.0000e-03 eta: 4:14:42 time: 0.5980 data_time: 0.0076 memory: 15293 grad_norm: 4.4235 loss: 0.7605 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7605 2023/05/27 14:19:35 - mmengine - INFO - Epoch(train) [23][660/940] lr: 1.0000e-03 eta: 4:14:28 time: 0.5901 data_time: 0.0075 memory: 15293 grad_norm: 5.0971 loss: 0.9433 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.9433 2023/05/27 14:19:46 - mmengine - INFO - Epoch(train) [23][680/940] lr: 1.0000e-03 eta: 4:14:13 time: 0.5849 data_time: 0.0077 memory: 15293 grad_norm: 5.3549 loss: 0.7007 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7007 2023/05/27 14:19:58 - mmengine - INFO - Epoch(train) [23][700/940] lr: 1.0000e-03 eta: 4:14:00 time: 0.5923 data_time: 0.0075 memory: 15293 grad_norm: 
4.3845 loss: 0.6367 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6367 2023/05/27 14:20:10 - mmengine - INFO - Epoch(train) [23][720/940] lr: 1.0000e-03 eta: 4:13:53 time: 0.6093 data_time: 0.0291 memory: 15293 grad_norm: 4.3937 loss: 0.7483 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7483 2023/05/27 14:20:22 - mmengine - INFO - Epoch(train) [23][740/940] lr: 1.0000e-03 eta: 4:13:44 time: 0.6035 data_time: 0.0192 memory: 15293 grad_norm: 4.3044 loss: 0.7126 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7126 2023/05/27 14:20:34 - mmengine - INFO - Epoch(train) [23][760/940] lr: 1.0000e-03 eta: 4:13:30 time: 0.5887 data_time: 0.0076 memory: 15293 grad_norm: 4.1679 loss: 0.6235 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6235 2023/05/27 14:20:46 - mmengine - INFO - Epoch(train) [23][780/940] lr: 1.0000e-03 eta: 4:13:19 time: 0.5982 data_time: 0.0105 memory: 15293 grad_norm: 4.0497 loss: 0.9368 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.9368 2023/05/27 14:20:58 - mmengine - INFO - Epoch(train) [23][800/940] lr: 1.0000e-03 eta: 4:13:08 time: 0.5980 data_time: 0.0078 memory: 15293 grad_norm: 4.2823 loss: 0.9178 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.9178 2023/05/27 14:21:10 - mmengine - INFO - Epoch(train) [23][820/940] lr: 1.0000e-03 eta: 4:12:55 time: 0.5940 data_time: 0.0120 memory: 15293 grad_norm: 4.4119 loss: 0.7122 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7122 2023/05/27 14:21:22 - mmengine - INFO - Epoch(train) [23][840/940] lr: 1.0000e-03 eta: 4:12:48 time: 0.6112 data_time: 0.0247 memory: 15293 grad_norm: 4.2024 loss: 0.8830 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.8830 2023/05/27 14:21:34 - mmengine - INFO - Epoch(train) [23][860/940] lr: 1.0000e-03 eta: 4:12:38 time: 0.6015 data_time: 0.0141 memory: 15293 grad_norm: 4.6771 loss: 0.8610 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8610 2023/05/27 14:21:46 - mmengine - INFO - Epoch(train) [23][880/940] lr: 1.0000e-03 eta: 4:12:27 time: 0.5980 data_time: 0.0074 memory: 15293 grad_norm: 
4.4343 loss: 0.6629 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6629 2023/05/27 14:21:58 - mmengine - INFO - Epoch(train) [23][900/940] lr: 1.0000e-03 eta: 4:12:16 time: 0.5978 data_time: 0.0114 memory: 15293 grad_norm: 4.5882 loss: 0.7597 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.7597 2023/05/27 14:22:10 - mmengine - INFO - Epoch(train) [23][920/940] lr: 1.0000e-03 eta: 4:12:01 time: 0.5865 data_time: 0.0076 memory: 15293 grad_norm: 4.6457 loss: 0.7522 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7522 2023/05/27 14:22:21 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 14:22:21 - mmengine - INFO - Epoch(train) [23][940/940] lr: 1.0000e-03 eta: 4:11:43 time: 0.5708 data_time: 0.0123 memory: 15293 grad_norm: 4.4249 loss: 0.6955 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6955 2023/05/27 14:22:27 - mmengine - INFO - Epoch(val) [23][20/78] eta: 0:00:17 time: 0.3055 data_time: 0.1210 memory: 2851 2023/05/27 14:22:32 - mmengine - INFO - Epoch(val) [23][40/78] eta: 0:00:10 time: 0.2445 data_time: 0.0599 memory: 2851 2023/05/27 14:22:37 - mmengine - INFO - Epoch(val) [23][60/78] eta: 0:00:04 time: 0.2330 data_time: 0.0484 memory: 2851 2023/05/27 14:22:44 - mmengine - INFO - Epoch(val) [23][78/78] acc/top1: 0.7664 acc/top5: 0.9291 acc/mean1: 0.7663 data_time: 0.0591 time: 0.2407 2023/05/27 14:22:58 - mmengine - INFO - Epoch(train) [24][ 20/940] lr: 1.0000e-03 eta: 4:11:58 time: 0.6984 data_time: 0.0740 memory: 15293 grad_norm: 4.3437 loss: 0.5952 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5952 2023/05/27 14:23:10 - mmengine - INFO - Epoch(train) [24][ 40/940] lr: 1.0000e-03 eta: 4:11:45 time: 0.5895 data_time: 0.0072 memory: 15293 grad_norm: 4.4824 loss: 0.5857 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5857 2023/05/27 14:23:22 - mmengine - INFO - Epoch(train) [24][ 60/940] lr: 1.0000e-03 eta: 4:11:36 time: 0.6094 data_time: 0.0279 memory: 15293 grad_norm: 4.1345 loss: 0.6733 
top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6733 2023/05/27 14:23:34 - mmengine - INFO - Epoch(train) [24][ 80/940] lr: 1.0000e-03 eta: 4:11:25 time: 0.5979 data_time: 0.0189 memory: 15293 grad_norm: 4.1118 loss: 0.8965 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.8965 2023/05/27 14:23:46 - mmengine - INFO - Epoch(train) [24][100/940] lr: 1.0000e-03 eta: 4:11:14 time: 0.5997 data_time: 0.0210 memory: 15293 grad_norm: 6.0653 loss: 0.7664 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7664 2023/05/27 14:23:58 - mmengine - INFO - Epoch(train) [24][120/940] lr: 1.0000e-03 eta: 4:11:00 time: 0.5889 data_time: 0.0098 memory: 15293 grad_norm: 4.3522 loss: 0.7715 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7715 2023/05/27 14:24:10 - mmengine - INFO - Epoch(train) [24][140/940] lr: 1.0000e-03 eta: 4:10:48 time: 0.5973 data_time: 0.0176 memory: 15293 grad_norm: 4.0341 loss: 0.6566 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6566 2023/05/27 14:24:22 - mmengine - INFO - Epoch(train) [24][160/940] lr: 1.0000e-03 eta: 4:10:37 time: 0.6002 data_time: 0.0194 memory: 15293 grad_norm: 4.2447 loss: 0.6467 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6467 2023/05/27 14:24:34 - mmengine - INFO - Epoch(train) [24][180/940] lr: 1.0000e-03 eta: 4:10:27 time: 0.6022 data_time: 0.0248 memory: 15293 grad_norm: 4.3451 loss: 0.7477 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.7477 2023/05/27 14:24:46 - mmengine - INFO - Epoch(train) [24][200/940] lr: 1.0000e-03 eta: 4:10:13 time: 0.5889 data_time: 0.0084 memory: 15293 grad_norm: 4.3724 loss: 0.5988 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5988 2023/05/27 14:24:58 - mmengine - INFO - Epoch(train) [24][220/940] lr: 1.0000e-03 eta: 4:10:00 time: 0.5916 data_time: 0.0080 memory: 15293 grad_norm: 4.1991 loss: 0.6116 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6116 2023/05/27 14:25:10 - mmengine - INFO - Epoch(train) [24][240/940] lr: 1.0000e-03 eta: 4:09:46 time: 0.5882 data_time: 0.0069 memory: 15293 grad_norm: 4.1709 loss: 0.5962 
top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5962 2023/05/27 14:25:21 - mmengine - INFO - Epoch(train) [24][260/940] lr: 1.0000e-03 eta: 4:09:33 time: 0.5886 data_time: 0.0075 memory: 15293 grad_norm: 4.0798 loss: 0.6565 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6565 2023/05/27 14:25:33 - mmengine - INFO - Epoch(train) [24][280/940] lr: 1.0000e-03 eta: 4:09:20 time: 0.5916 data_time: 0.0113 memory: 15293 grad_norm: 4.6798 loss: 0.7128 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7128 2023/05/27 14:25:45 - mmengine - INFO - Epoch(train) [24][300/940] lr: 1.0000e-03 eta: 4:09:06 time: 0.5884 data_time: 0.0077 memory: 15293 grad_norm: 4.1970 loss: 0.7229 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7229 2023/05/27 14:25:57 - mmengine - INFO - Epoch(train) [24][320/940] lr: 1.0000e-03 eta: 4:08:57 time: 0.6110 data_time: 0.0269 memory: 15293 grad_norm: 4.2116 loss: 0.8636 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8636 2023/05/27 14:26:09 - mmengine - INFO - Epoch(train) [24][340/940] lr: 1.0000e-03 eta: 4:08:49 time: 0.6108 data_time: 0.0299 memory: 15293 grad_norm: 4.7049 loss: 0.7876 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7876 2023/05/27 14:26:21 - mmengine - INFO - Epoch(train) [24][360/940] lr: 1.0000e-03 eta: 4:08:37 time: 0.5961 data_time: 0.0173 memory: 15293 grad_norm: 4.2471 loss: 0.6414 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6414 2023/05/27 14:26:33 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 14:26:33 - mmengine - INFO - Epoch(train) [24][380/940] lr: 1.0000e-03 eta: 4:08:22 time: 0.5851 data_time: 0.0081 memory: 15293 grad_norm: 4.1685 loss: 1.0046 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 1.0046 2023/05/27 14:26:45 - mmengine - INFO - Epoch(train) [24][400/940] lr: 1.0000e-03 eta: 4:08:10 time: 0.5944 data_time: 0.0153 memory: 15293 grad_norm: 4.6668 loss: 0.6411 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6411 2023/05/27 14:26:57 - mmengine - 
INFO - Epoch(train) [24][420/940] lr: 1.0000e-03 eta: 4:07:56 time: 0.5864 data_time: 0.0076 memory: 15293 grad_norm: 4.0969 loss: 0.7330 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7330 2023/05/27 14:27:08 - mmengine - INFO - Epoch(train) [24][440/940] lr: 1.0000e-03 eta: 4:07:43 time: 0.5905 data_time: 0.0075 memory: 15293 grad_norm: 4.1566 loss: 0.7319 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7319 2023/05/27 14:27:20 - mmengine - INFO - Epoch(train) [24][460/940] lr: 1.0000e-03 eta: 4:07:31 time: 0.5976 data_time: 0.0084 memory: 15293 grad_norm: 4.0627 loss: 0.6432 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6432 2023/05/27 14:27:33 - mmengine - INFO - Epoch(train) [24][480/940] lr: 1.0000e-03 eta: 4:07:25 time: 0.6237 data_time: 0.0397 memory: 15293 grad_norm: 4.8219 loss: 0.6794 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6794 2023/05/27 14:27:45 - mmengine - INFO - Epoch(train) [24][500/940] lr: 1.0000e-03 eta: 4:07:14 time: 0.5972 data_time: 0.0181 memory: 15293 grad_norm: 4.0530 loss: 0.7886 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7886 2023/05/27 14:27:57 - mmengine - INFO - Epoch(train) [24][520/940] lr: 1.0000e-03 eta: 4:07:00 time: 0.5878 data_time: 0.0100 memory: 15293 grad_norm: 4.2375 loss: 0.5470 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5470 2023/05/27 14:28:09 - mmengine - INFO - Epoch(train) [24][540/940] lr: 1.0000e-03 eta: 4:06:49 time: 0.6023 data_time: 0.0233 memory: 15293 grad_norm: 3.9723 loss: 0.7676 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7676 2023/05/27 14:28:21 - mmengine - INFO - Epoch(train) [24][560/940] lr: 1.0000e-03 eta: 4:06:39 time: 0.6025 data_time: 0.0247 memory: 15293 grad_norm: 3.9928 loss: 0.6643 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6643 2023/05/27 14:28:32 - mmengine - INFO - Epoch(train) [24][580/940] lr: 1.0000e-03 eta: 4:06:26 time: 0.5921 data_time: 0.0117 memory: 15293 grad_norm: 4.3195 loss: 0.8694 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8694 2023/05/27 14:28:44 - mmengine - 
INFO - Epoch(train) [24][600/940] lr: 1.0000e-03 eta: 4:06:13 time: 0.5897 data_time: 0.0091 memory: 15293 grad_norm: 4.5320 loss: 0.7331 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7331 2023/05/27 14:28:56 - mmengine - INFO - Epoch(train) [24][620/940] lr: 1.0000e-03 eta: 4:05:59 time: 0.5858 data_time: 0.0077 memory: 15293 grad_norm: 4.2316 loss: 0.7422 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7422 2023/05/27 14:29:08 - mmengine - INFO - Epoch(train) [24][640/940] lr: 1.0000e-03 eta: 4:05:46 time: 0.5909 data_time: 0.0075 memory: 15293 grad_norm: 4.2849 loss: 0.5840 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5840 2023/05/27 14:29:20 - mmengine - INFO - Epoch(train) [24][660/940] lr: 1.0000e-03 eta: 4:05:35 time: 0.6001 data_time: 0.0199 memory: 15293 grad_norm: 4.2175 loss: 0.7340 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7340 2023/05/27 14:29:32 - mmengine - INFO - Epoch(train) [24][680/940] lr: 1.0000e-03 eta: 4:05:23 time: 0.5948 data_time: 0.0146 memory: 15293 grad_norm: 4.2281 loss: 0.8472 top1_acc: 0.3750 top5_acc: 0.7500 loss_cls: 0.8472 2023/05/27 14:29:44 - mmengine - INFO - Epoch(train) [24][700/940] lr: 1.0000e-03 eta: 4:05:10 time: 0.5919 data_time: 0.0144 memory: 15293 grad_norm: 4.1970 loss: 0.5787 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5787 2023/05/27 14:29:55 - mmengine - INFO - Epoch(train) [24][720/940] lr: 1.0000e-03 eta: 4:04:56 time: 0.5862 data_time: 0.0076 memory: 15293 grad_norm: 5.3120 loss: 0.6606 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6606 2023/05/27 14:30:08 - mmengine - INFO - Epoch(train) [24][740/940] lr: 1.0000e-03 eta: 4:04:49 time: 0.6207 data_time: 0.0377 memory: 15293 grad_norm: 4.6061 loss: 0.7923 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7923 2023/05/27 14:30:20 - mmengine - INFO - Epoch(train) [24][760/940] lr: 1.0000e-03 eta: 4:04:39 time: 0.6093 data_time: 0.0205 memory: 15293 grad_norm: 4.5264 loss: 0.9389 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.9389 2023/05/27 14:30:32 - mmengine - 
INFO - Epoch(train) [24][780/940] lr: 1.0000e-03 eta: 4:04:26 time: 0.5884 data_time: 0.0083 memory: 15293 grad_norm: 4.7684 loss: 0.5828 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5828 2023/05/27 14:30:44 - mmengine - INFO - Epoch(train) [24][800/940] lr: 1.0000e-03 eta: 4:04:18 time: 0.6168 data_time: 0.0367 memory: 15293 grad_norm: 5.2003 loss: 0.7024 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7024 2023/05/27 14:30:56 - mmengine - INFO - Epoch(train) [24][820/940] lr: 1.0000e-03 eta: 4:04:09 time: 0.6133 data_time: 0.0228 memory: 15293 grad_norm: 5.2778 loss: 0.5019 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5019 2023/05/27 14:31:08 - mmengine - INFO - Epoch(train) [24][840/940] lr: 1.0000e-03 eta: 4:03:57 time: 0.5946 data_time: 0.0077 memory: 15293 grad_norm: 4.5197 loss: 0.6744 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6744 2023/05/27 14:31:20 - mmengine - INFO - Epoch(train) [24][860/940] lr: 1.0000e-03 eta: 4:03:45 time: 0.5966 data_time: 0.0190 memory: 15293 grad_norm: 3.9881 loss: 0.6397 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6397 2023/05/27 14:31:32 - mmengine - INFO - Epoch(train) [24][880/940] lr: 1.0000e-03 eta: 4:03:33 time: 0.5965 data_time: 0.0181 memory: 15293 grad_norm: 4.2727 loss: 0.7080 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7080 2023/05/27 14:31:44 - mmengine - INFO - Epoch(train) [24][900/940] lr: 1.0000e-03 eta: 4:03:24 time: 0.6149 data_time: 0.0257 memory: 15293 grad_norm: 6.7568 loss: 0.6672 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6672 2023/05/27 14:31:57 - mmengine - INFO - Epoch(train) [24][920/940] lr: 1.0000e-03 eta: 4:03:17 time: 0.6215 data_time: 0.0413 memory: 15293 grad_norm: 4.0459 loss: 0.7932 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7932 2023/05/27 14:32:08 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 14:32:08 - mmengine - INFO - Epoch(train) [24][940/940] lr: 1.0000e-03 eta: 4:03:00 time: 0.5683 
data_time: 0.0075 memory: 15293 grad_norm: 4.6083 loss: 0.8315 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.8315 2023/05/27 14:32:08 - mmengine - INFO - Saving checkpoint at 24 epochs 2023/05/27 14:32:20 - mmengine - INFO - Epoch(val) [24][20/78] eta: 0:00:18 time: 0.3157 data_time: 0.1315 memory: 2851 2023/05/27 14:32:24 - mmengine - INFO - Epoch(val) [24][40/78] eta: 0:00:10 time: 0.2354 data_time: 0.0512 memory: 2851 2023/05/27 14:32:29 - mmengine - INFO - Epoch(val) [24][60/78] eta: 0:00:04 time: 0.2023 data_time: 0.0193 memory: 2851 2023/05/27 14:34:42 - mmengine - INFO - Epoch(val) [24][78/78] acc/top1: 0.7684 acc/top5: 0.9292 acc/mean1: 0.7683 data_time: 0.0521 time: 0.2329 2023/05/27 14:34:42 - mmengine - INFO - The previous best checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_22.pth is removed 2023/05/27 14:34:45 - mmengine - INFO - The best checkpoint with 0.7684 acc/top1 at 24 epoch is saved to best_acc_top1_epoch_24.pth. 
2023/05/27 14:34:58 - mmengine - INFO - Epoch(train) [25][ 20/940] lr: 1.0000e-03 eta: 4:02:57 time: 0.6514 data_time: 0.0708 memory: 15293 grad_norm: 4.2747 loss: 0.7545 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7545 2023/05/27 14:35:10 - mmengine - INFO - Epoch(train) [25][ 40/940] lr: 1.0000e-03 eta: 4:02:44 time: 0.5897 data_time: 0.0078 memory: 15293 grad_norm: 4.3553 loss: 0.8457 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8457 2023/05/27 14:35:22 - mmengine - INFO - Epoch(train) [25][ 60/940] lr: 1.0000e-03 eta: 4:02:33 time: 0.5997 data_time: 0.0166 memory: 15293 grad_norm: 4.1003 loss: 1.1814 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 1.1814 2023/05/27 14:35:34 - mmengine - INFO - Epoch(train) [25][ 80/940] lr: 1.0000e-03 eta: 4:02:20 time: 0.5948 data_time: 0.0116 memory: 15293 grad_norm: 4.2072 loss: 0.5764 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5764 2023/05/27 14:35:46 - mmengine - INFO - Epoch(train) [25][100/940] lr: 1.0000e-03 eta: 4:02:08 time: 0.5966 data_time: 0.0119 memory: 15293 grad_norm: 4.1260 loss: 0.6414 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6414 2023/05/27 14:35:57 - mmengine - INFO - Epoch(train) [25][120/940] lr: 1.0000e-03 eta: 4:01:55 time: 0.5862 data_time: 0.0093 memory: 15293 grad_norm: 4.3970 loss: 0.9111 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.9111 2023/05/27 14:36:10 - mmengine - INFO - Epoch(train) [25][140/940] lr: 1.0000e-03 eta: 4:01:48 time: 0.6304 data_time: 0.0535 memory: 15293 grad_norm: 4.3019 loss: 0.7131 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.7131 2023/05/27 14:36:22 - mmengine - INFO - Epoch(train) [25][160/940] lr: 1.0000e-03 eta: 4:01:36 time: 0.5961 data_time: 0.0094 memory: 15293 grad_norm: 4.2204 loss: 0.6412 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6412 2023/05/27 14:36:34 - mmengine - INFO - Epoch(train) [25][180/940] lr: 1.0000e-03 eta: 4:01:26 time: 0.6114 data_time: 0.0160 memory: 15293 grad_norm: 4.0120 loss: 0.5629 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5629 
2023/05/27 14:36:46 - mmengine - INFO - Epoch(train) [25][200/940] lr: 1.0000e-03 eta: 4:01:13 time: 0.5852 data_time: 0.0074 memory: 15293 grad_norm: 4.2036 loss: 0.6052 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6052 2023/05/27 14:36:58 - mmengine - INFO - Epoch(train) [25][220/940] lr: 1.0000e-03 eta: 4:00:59 time: 0.5898 data_time: 0.0090 memory: 15293 grad_norm: 4.3611 loss: 0.4176 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4176 2023/05/27 14:37:10 - mmengine - INFO - Epoch(train) [25][240/940] lr: 1.0000e-03 eta: 4:00:48 time: 0.5979 data_time: 0.0097 memory: 15293 grad_norm: 4.2665 loss: 0.6817 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6817 2023/05/27 14:37:22 - mmengine - INFO - Epoch(train) [25][260/940] lr: 1.0000e-03 eta: 4:00:36 time: 0.6007 data_time: 0.0218 memory: 15293 grad_norm: 4.7129 loss: 0.5826 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5826 2023/05/27 14:37:33 - mmengine - INFO - Epoch(train) [25][280/940] lr: 1.0000e-03 eta: 4:00:23 time: 0.5893 data_time: 0.0076 memory: 15293 grad_norm: 4.4360 loss: 0.6508 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6508 2023/05/27 14:37:45 - mmengine - INFO - Epoch(train) [25][300/940] lr: 1.0000e-03 eta: 4:00:11 time: 0.5955 data_time: 0.0078 memory: 15293 grad_norm: 4.2608 loss: 0.6450 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6450 2023/05/27 14:37:57 - mmengine - INFO - Epoch(train) [25][320/940] lr: 1.0000e-03 eta: 3:59:59 time: 0.5968 data_time: 0.0188 memory: 15293 grad_norm: 4.1454 loss: 0.5563 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5563 2023/05/27 14:38:09 - mmengine - INFO - Epoch(train) [25][340/940] lr: 1.0000e-03 eta: 3:59:47 time: 0.5945 data_time: 0.0118 memory: 15293 grad_norm: 4.8134 loss: 0.5550 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5550 2023/05/27 14:38:21 - mmengine - INFO - Epoch(train) [25][360/940] lr: 1.0000e-03 eta: 3:59:34 time: 0.5903 data_time: 0.0077 memory: 15293 grad_norm: 4.1358 loss: 0.4917 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4917 
2023/05/27 14:38:33 - mmengine - INFO - Epoch(train) [25][380/940] lr: 1.0000e-03 eta: 3:59:22 time: 0.5994 data_time: 0.0180 memory: 15293 grad_norm: 4.0933 loss: 0.5964 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5964
2023/05/27 14:38:45 - mmengine - INFO - Epoch(train) [25][400/940] lr: 1.0000e-03 eta: 3:59:10 time: 0.5946 data_time: 0.0073 memory: 15293 grad_norm: 4.0907 loss: 0.6963 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6963
2023/05/27 14:38:57 - mmengine - INFO - Epoch(train) [25][420/940] lr: 1.0000e-03 eta: 3:58:57 time: 0.5926 data_time: 0.0135 memory: 15293 grad_norm: 4.1019 loss: 0.6713 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6713
2023/05/27 14:39:09 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 14:39:09 - mmengine - INFO - Epoch(train) [25][440/940] lr: 1.0000e-03 eta: 3:58:45 time: 0.5978 data_time: 0.0191 memory: 15293 grad_norm: 4.5961 loss: 0.9418 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.9418
2023/05/27 14:39:21 - mmengine - INFO - Epoch(train) [25][460/940] lr: 1.0000e-03 eta: 3:58:36 time: 0.6165 data_time: 0.0337 memory: 15293 grad_norm: 4.5257 loss: 0.9467 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.9467
2023/05/27 14:39:33 - mmengine - INFO - Epoch(train) [25][480/940] lr: 1.0000e-03 eta: 3:58:24 time: 0.5965 data_time: 0.0142 memory: 15293 grad_norm: 4.4438 loss: 0.6536 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6536
2023/05/27 14:39:45 - mmengine - INFO - Epoch(train) [25][500/940] lr: 1.0000e-03 eta: 3:58:15 time: 0.6125 data_time: 0.0219 memory: 15293 grad_norm: 4.5613 loss: 0.6404 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6404
2023/05/27 14:39:58 - mmengine - INFO - Epoch(train) [25][520/940] lr: 1.0000e-03 eta: 3:58:09 time: 0.6430 data_time: 0.0445 memory: 15293 grad_norm: 4.1482 loss: 0.7903 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7903
2023/05/27 14:40:10 - mmengine - INFO - Epoch(train) [25][540/940] lr: 1.0000e-03 eta: 3:57:59 time: 0.6107 data_time: 0.0301 memory: 15293 grad_norm: 4.5109 loss: 0.6796 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6796
2023/05/27 14:40:22 - mmengine - INFO - Epoch(train) [25][560/940] lr: 1.0000e-03 eta: 3:57:47 time: 0.5946 data_time: 0.0136 memory: 15293 grad_norm: 4.8095 loss: 0.7511 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7511
2023/05/27 14:40:34 - mmengine - INFO - Epoch(train) [25][580/940] lr: 1.0000e-03 eta: 3:57:35 time: 0.6013 data_time: 0.0209 memory: 15293 grad_norm: 4.5217 loss: 0.5200 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5200
2023/05/27 14:40:46 - mmengine - INFO - Epoch(train) [25][600/940] lr: 1.0000e-03 eta: 3:57:24 time: 0.5999 data_time: 0.0144 memory: 15293 grad_norm: 4.5776 loss: 0.7014 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7014
2023/05/27 14:40:58 - mmengine - INFO - Epoch(train) [25][620/940] lr: 1.0000e-03 eta: 3:57:12 time: 0.5971 data_time: 0.0193 memory: 15293 grad_norm: 4.6576 loss: 0.6959 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6959
2023/05/27 14:41:10 - mmengine - INFO - Epoch(train) [25][640/940] lr: 1.0000e-03 eta: 3:56:59 time: 0.5946 data_time: 0.0145 memory: 15293 grad_norm: 4.1589 loss: 0.4858 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4858
2023/05/27 14:41:22 - mmengine - INFO - Epoch(train) [25][660/940] lr: 1.0000e-03 eta: 3:56:50 time: 0.6122 data_time: 0.0288 memory: 15293 grad_norm: 4.8310 loss: 0.7307 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7307
2023/05/27 14:41:35 - mmengine - INFO - Epoch(train) [25][680/940] lr: 1.0000e-03 eta: 3:56:40 time: 0.6131 data_time: 0.0342 memory: 15293 grad_norm: 4.4195 loss: 0.6906 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6906
2023/05/27 14:41:46 - mmengine - INFO - Epoch(train) [25][700/940] lr: 1.0000e-03 eta: 3:56:27 time: 0.5901 data_time: 0.0076 memory: 15293 grad_norm: 4.2309 loss: 0.6100 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6100
2023/05/27 14:41:58 - mmengine - INFO - Epoch(train) [25][720/940] lr: 1.0000e-03 eta: 3:56:15 time: 0.5970 data_time: 0.0191 memory: 15293 grad_norm: 4.8339 loss: 0.7390 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7390
2023/05/27 14:42:11 - mmengine - INFO - Epoch(train) [25][740/940] lr: 1.0000e-03 eta: 3:56:07 time: 0.6264 data_time: 0.0487 memory: 15293 grad_norm: 4.2225 loss: 0.8398 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8398
2023/05/27 14:42:23 - mmengine - INFO - Epoch(train) [25][760/940] lr: 1.0000e-03 eta: 3:55:53 time: 0.5895 data_time: 0.0080 memory: 15293 grad_norm: 4.2002 loss: 0.7179 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7179
2023/05/27 14:42:34 - mmengine - INFO - Epoch(train) [25][780/940] lr: 1.0000e-03 eta: 3:55:41 time: 0.5912 data_time: 0.0077 memory: 15293 grad_norm: 4.3852 loss: 0.8139 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.8139
2023/05/27 14:42:46 - mmengine - INFO - Epoch(train) [25][800/940] lr: 1.0000e-03 eta: 3:55:27 time: 0.5853 data_time: 0.0075 memory: 15293 grad_norm: 4.4596 loss: 0.7060 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7060
2023/05/27 14:42:58 - mmengine - INFO - Epoch(train) [25][820/940] lr: 1.0000e-03 eta: 3:55:15 time: 0.5943 data_time: 0.0115 memory: 15293 grad_norm: 5.9035 loss: 0.6967 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6967
2023/05/27 14:43:10 - mmengine - INFO - Epoch(train) [25][840/940] lr: 1.0000e-03 eta: 3:55:02 time: 0.5952 data_time: 0.0155 memory: 15293 grad_norm: 4.1512 loss: 0.7789 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7789
2023/05/27 14:43:22 - mmengine - INFO - Epoch(train) [25][860/940] lr: 1.0000e-03 eta: 3:54:50 time: 0.5964 data_time: 0.0155 memory: 15293 grad_norm: 4.7874 loss: 0.6138 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6138
2023/05/27 14:43:34 - mmengine - INFO - Epoch(train) [25][880/940] lr: 1.0000e-03 eta: 3:54:39 time: 0.6017 data_time: 0.0112 memory: 15293 grad_norm: 4.6947 loss: 0.5720 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5720
2023/05/27 14:43:46 - mmengine - INFO - Epoch(train) [25][900/940] lr: 1.0000e-03 eta: 3:54:26 time: 0.5908 data_time: 0.0077 memory: 15293 grad_norm: 4.6588 loss: 0.8946 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8946
2023/05/27 14:43:58 - mmengine - INFO - Epoch(train) [25][920/940] lr: 1.0000e-03 eta: 3:54:16 time: 0.6103 data_time: 0.0295 memory: 15293 grad_norm: 5.9632 loss: 0.7974 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7974
2023/05/27 14:44:10 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 14:44:10 - mmengine - INFO - Epoch(train) [25][940/940] lr: 1.0000e-03 eta: 3:54:02 time: 0.5795 data_time: 0.0178 memory: 15293 grad_norm: 4.7854 loss: 0.5935 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5935
2023/05/27 14:44:16 - mmengine - INFO - Epoch(val) [25][20/78] eta: 0:00:17 time: 0.3014 data_time: 0.1166 memory: 2851
2023/05/27 14:44:21 - mmengine - INFO - Epoch(val) [25][40/78] eta: 0:00:10 time: 0.2482 data_time: 0.0631 memory: 2851
2023/05/27 14:44:25 - mmengine - INFO - Epoch(val) [25][60/78] eta: 0:00:04 time: 0.2268 data_time: 0.0422 memory: 2851
2023/05/27 14:47:23 - mmengine - INFO - Epoch(val) [25][78/78] acc/top1: 0.7686 acc/top5: 0.9290 acc/mean1: 0.7685 data_time: 0.0572 time: 0.2389
2023/05/27 14:47:23 - mmengine - INFO - The previous best checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_24.pth is removed
2023/05/27 14:47:26 - mmengine - INFO - The best checkpoint with 0.7686 acc/top1 at 25 epoch is saved to best_acc_top1_epoch_25.pth.
2023/05/27 14:47:39 - mmengine - INFO - Epoch(train) [26][ 20/940] lr: 1.0000e-03 eta: 3:53:57 time: 0.6596 data_time: 0.0821 memory: 15293 grad_norm: 4.1765 loss: 0.7223 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7223
2023/05/27 14:47:51 - mmengine - INFO - Epoch(train) [26][ 40/940] lr: 1.0000e-03 eta: 3:53:45 time: 0.5938 data_time: 0.0186 memory: 15293 grad_norm: 4.1354 loss: 0.6450 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6450
2023/05/27 14:48:03 - mmengine - INFO - Epoch(train) [26][ 60/940] lr: 1.0000e-03 eta: 3:53:33 time: 0.5970 data_time: 0.0159 memory: 15293 grad_norm: 4.2915 loss: 0.5428 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5428
2023/05/27 14:48:15 - mmengine - INFO - Epoch(train) [26][ 80/940] lr: 1.0000e-03 eta: 3:53:23 time: 0.6135 data_time: 0.0281 memory: 15293 grad_norm: 4.6948 loss: 0.7734 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.7734
2023/05/27 14:48:27 - mmengine - INFO - Epoch(train) [26][100/940] lr: 1.0000e-03 eta: 3:53:10 time: 0.5935 data_time: 0.0151 memory: 15293 grad_norm: 4.5306 loss: 0.7129 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7129
2023/05/27 14:48:39 - mmengine - INFO - Epoch(train) [26][120/940] lr: 1.0000e-03 eta: 3:52:58 time: 0.5964 data_time: 0.0145 memory: 15293 grad_norm: 4.7664 loss: 0.7502 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7502
2023/05/27 14:48:51 - mmengine - INFO - Epoch(train) [26][140/940] lr: 1.0000e-03 eta: 3:52:46 time: 0.5935 data_time: 0.0080 memory: 15293 grad_norm: 4.6688 loss: 0.5632 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5632
2023/05/27 14:49:03 - mmengine - INFO - Epoch(train) [26][160/940] lr: 1.0000e-03 eta: 3:52:34 time: 0.5998 data_time: 0.0155 memory: 15293 grad_norm: 5.4529 loss: 0.7690 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7690
2023/05/27 14:49:14 - mmengine - INFO - Epoch(train) [26][180/940] lr: 1.0000e-03 eta: 3:52:21 time: 0.5906 data_time: 0.0074 memory: 15293 grad_norm: 4.2568 loss: 0.6710 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6710
2023/05/27 14:49:26 - mmengine - INFO - Epoch(train) [26][200/940] lr: 1.0000e-03 eta: 3:52:09 time: 0.5951 data_time: 0.0075 memory: 15293 grad_norm: 4.5661 loss: 0.7453 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7453
2023/05/27 14:49:38 - mmengine - INFO - Epoch(train) [26][220/940] lr: 1.0000e-03 eta: 3:51:57 time: 0.5955 data_time: 0.0079 memory: 15293 grad_norm: 4.5234 loss: 0.7638 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7638
2023/05/27 14:49:50 - mmengine - INFO - Epoch(train) [26][240/940] lr: 1.0000e-03 eta: 3:51:45 time: 0.6015 data_time: 0.0074 memory: 15293 grad_norm: 4.4719 loss: 0.8036 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.8036
2023/05/27 14:50:02 - mmengine - INFO - Epoch(train) [26][260/940] lr: 1.0000e-03 eta: 3:51:34 time: 0.6066 data_time: 0.0185 memory: 15293 grad_norm: 4.3756 loss: 0.5183 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5183
2023/05/27 14:50:14 - mmengine - INFO - Epoch(train) [26][280/940] lr: 1.0000e-03 eta: 3:51:21 time: 0.5883 data_time: 0.0076 memory: 15293 grad_norm: 4.6000 loss: 0.7829 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7829
2023/05/27 14:50:26 - mmengine - INFO - Epoch(train) [26][300/940] lr: 1.0000e-03 eta: 3:51:09 time: 0.5992 data_time: 0.0150 memory: 15293 grad_norm: 4.8134 loss: 0.8980 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.8980
2023/05/27 14:50:38 - mmengine - INFO - Epoch(train) [26][320/940] lr: 1.0000e-03 eta: 3:50:58 time: 0.6016 data_time: 0.0152 memory: 15293 grad_norm: 4.1221 loss: 0.7083 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7083
2023/05/27 14:50:50 - mmengine - INFO - Epoch(train) [26][340/940] lr: 1.0000e-03 eta: 3:50:44 time: 0.5864 data_time: 0.0081 memory: 15293 grad_norm: 4.4751 loss: 0.7531 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7531
2023/05/27 14:51:02 - mmengine - INFO - Epoch(train) [26][360/940] lr: 1.0000e-03 eta: 3:50:31 time: 0.5875 data_time: 0.0087 memory: 15293 grad_norm: 4.1395 loss: 0.6119 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6119
2023/05/27 14:51:13 - mmengine - INFO - Epoch(train) [26][380/940] lr: 1.0000e-03 eta: 3:50:19 time: 0.5911 data_time: 0.0078 memory: 15293 grad_norm: 4.2966 loss: 0.8842 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8842
2023/05/27 14:51:26 - mmengine - INFO - Epoch(train) [26][400/940] lr: 1.0000e-03 eta: 3:50:08 time: 0.6120 data_time: 0.0185 memory: 15293 grad_norm: 4.1753 loss: 0.7271 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7271
2023/05/27 14:51:37 - mmengine - INFO - Epoch(train) [26][420/940] lr: 1.0000e-03 eta: 3:49:55 time: 0.5859 data_time: 0.0075 memory: 15293 grad_norm: 4.2886 loss: 0.7927 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7927
2023/05/27 14:51:49 - mmengine - INFO - Epoch(train) [26][440/940] lr: 1.0000e-03 eta: 3:49:43 time: 0.5962 data_time: 0.0099 memory: 15293 grad_norm: 4.2847 loss: 0.5232 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5232
2023/05/27 14:52:01 - mmengine - INFO - Epoch(train) [26][460/940] lr: 1.0000e-03 eta: 3:49:30 time: 0.5910 data_time: 0.0114 memory: 15293 grad_norm: 4.0865 loss: 0.5688 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5688
2023/05/27 14:52:13 - mmengine - INFO - Epoch(train) [26][480/940] lr: 1.0000e-03 eta: 3:49:18 time: 0.5992 data_time: 0.0076 memory: 15293 grad_norm: 5.1043 loss: 0.7463 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7463
2023/05/27 14:52:25 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 14:52:25 - mmengine - INFO - Epoch(train) [26][500/940] lr: 1.0000e-03 eta: 3:49:06 time: 0.5966 data_time: 0.0077 memory: 15293 grad_norm: 4.5030 loss: 0.8231 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.8231
2023/05/27 14:52:37 - mmengine - INFO - Epoch(train) [26][520/940] lr: 1.0000e-03 eta: 3:48:54 time: 0.5950 data_time: 0.0098 memory: 15293 grad_norm: 4.1260 loss: 0.8146 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8146
2023/05/27 14:52:49 - mmengine - INFO - Epoch(train) [26][540/940] lr: 1.0000e-03 eta: 3:48:41 time: 0.5850 data_time: 0.0074 memory: 15293 grad_norm: 4.5481 loss: 0.6509 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6509
2023/05/27 14:53:01 - mmengine - INFO - Epoch(train) [26][560/940] lr: 1.0000e-03 eta: 3:48:30 time: 0.6064 data_time: 0.0075 memory: 15293 grad_norm: 4.4593 loss: 0.5518 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5518
2023/05/27 14:53:13 - mmengine - INFO - Epoch(train) [26][580/940] lr: 1.0000e-03 eta: 3:48:19 time: 0.6104 data_time: 0.0076 memory: 15293 grad_norm: 4.8193 loss: 0.8357 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8357
2023/05/27 14:53:25 - mmengine - INFO - Epoch(train) [26][600/940] lr: 1.0000e-03 eta: 3:48:07 time: 0.5973 data_time: 0.0075 memory: 15293 grad_norm: 4.5069 loss: 0.8294 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8294
2023/05/27 14:53:37 - mmengine - INFO - Epoch(train) [26][620/940] lr: 1.0000e-03 eta: 3:47:56 time: 0.6028 data_time: 0.0190 memory: 15293 grad_norm: 4.6189 loss: 0.7146 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7146
2023/05/27 14:53:49 - mmengine - INFO - Epoch(train) [26][640/940] lr: 1.0000e-03 eta: 3:47:44 time: 0.5957 data_time: 0.0109 memory: 15293 grad_norm: 8.8293 loss: 0.7959 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7959
2023/05/27 14:54:01 - mmengine - INFO - Epoch(train) [26][660/940] lr: 1.0000e-03 eta: 3:47:30 time: 0.5856 data_time: 0.0076 memory: 15293 grad_norm: 4.4750 loss: 0.6413 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6413
2023/05/27 14:54:13 - mmengine - INFO - Epoch(train) [26][680/940] lr: 1.0000e-03 eta: 3:47:19 time: 0.6032 data_time: 0.0232 memory: 15293 grad_norm: 4.8664 loss: 0.8650 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8650
2023/05/27 14:54:25 - mmengine - INFO - Epoch(train) [26][700/940] lr: 1.0000e-03 eta: 3:47:06 time: 0.5913 data_time: 0.0083 memory: 15293 grad_norm: 4.8201 loss: 0.5661 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5661
2023/05/27 14:54:36 - mmengine - INFO - Epoch(train) [26][720/940] lr: 1.0000e-03 eta: 3:46:53 time: 0.5879 data_time: 0.0079 memory: 15293 grad_norm: 4.2044 loss: 0.5684 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5684
2023/05/27 14:54:49 - mmengine - INFO - Epoch(train) [26][740/940] lr: 1.0000e-03 eta: 3:46:43 time: 0.6133 data_time: 0.0263 memory: 15293 grad_norm: 4.1473 loss: 0.6974 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6974
2023/05/27 14:55:00 - mmengine - INFO - Epoch(train) [26][760/940] lr: 1.0000e-03 eta: 3:46:30 time: 0.5897 data_time: 0.0075 memory: 15293 grad_norm: 4.2045 loss: 0.7693 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7693
2023/05/27 14:55:13 - mmengine - INFO - Epoch(train) [26][780/940] lr: 1.0000e-03 eta: 3:46:20 time: 0.6142 data_time: 0.0340 memory: 15293 grad_norm: 4.3012 loss: 0.5835 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5835
2023/05/27 14:55:25 - mmengine - INFO - Epoch(train) [26][800/940] lr: 1.0000e-03 eta: 3:46:08 time: 0.5936 data_time: 0.0116 memory: 15293 grad_norm: 4.4105 loss: 0.6343 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6343
2023/05/27 14:55:37 - mmengine - INFO - Epoch(train) [26][820/940] lr: 1.0000e-03 eta: 3:45:56 time: 0.6028 data_time: 0.0092 memory: 15293 grad_norm: 5.8343 loss: 0.6633 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6633
2023/05/27 14:55:49 - mmengine - INFO - Epoch(train) [26][840/940] lr: 1.0000e-03 eta: 3:45:44 time: 0.5979 data_time: 0.0077 memory: 15293 grad_norm: 4.3494 loss: 0.8291 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.8291
2023/05/27 14:56:00 - mmengine - INFO - Epoch(train) [26][860/940] lr: 1.0000e-03 eta: 3:45:32 time: 0.5906 data_time: 0.0093 memory: 15293 grad_norm: 4.4291 loss: 0.8205 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8205
2023/05/27 14:56:12 - mmengine - INFO - Epoch(train) [26][880/940] lr: 1.0000e-03 eta: 3:45:20 time: 0.5961 data_time: 0.0150 memory: 15293 grad_norm: 4.8155 loss: 0.5771 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5771
2023/05/27 14:56:24 - mmengine - INFO - Epoch(train) [26][900/940] lr: 1.0000e-03 eta: 3:45:07 time: 0.5950 data_time: 0.0169 memory: 15293 grad_norm: 4.1636 loss: 0.8069 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.8069
2023/05/27 14:56:36 - mmengine - INFO - Epoch(train) [26][920/940] lr: 1.0000e-03 eta: 3:44:54 time: 0.5884 data_time: 0.0077 memory: 15293 grad_norm: 4.4963 loss: 0.6022 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6022
2023/05/27 14:56:47 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 14:56:47 - mmengine - INFO - Epoch(train) [26][940/940] lr: 1.0000e-03 eta: 3:44:40 time: 0.5699 data_time: 0.0084 memory: 15293 grad_norm: 4.2783 loss: 0.6508 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6508
2023/05/27 14:56:53 - mmengine - INFO - Epoch(val) [26][20/78] eta: 0:00:17 time: 0.2947 data_time: 0.1096 memory: 2851
2023/05/27 14:56:58 - mmengine - INFO - Epoch(val) [26][40/78] eta: 0:00:10 time: 0.2330 data_time: 0.0473 memory: 2851
2023/05/27 14:57:03 - mmengine - INFO - Epoch(val) [26][60/78] eta: 0:00:04 time: 0.2351 data_time: 0.0502 memory: 2851
2023/05/27 14:57:09 - mmengine - INFO - Epoch(val) [26][78/78] acc/top1: 0.7698 acc/top5: 0.9291 acc/mean1: 0.7697 data_time: 0.0538 time: 0.2359
2023/05/27 14:57:09 - mmengine - INFO - The previous best checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_25.pth is removed
2023/05/27 14:57:10 - mmengine - INFO - The best checkpoint with 0.7698 acc/top1 at 26 epoch is saved to best_acc_top1_epoch_26.pth.
2023/05/27 14:57:24 - mmengine - INFO - Epoch(train) [27][ 20/940] lr: 1.0000e-03 eta: 3:44:35 time: 0.6710 data_time: 0.0892 memory: 15293 grad_norm: 4.1627 loss: 0.7062 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7062
2023/05/27 14:57:36 - mmengine - INFO - Epoch(train) [27][ 40/940] lr: 1.0000e-03 eta: 3:44:22 time: 0.5883 data_time: 0.0103 memory: 15293 grad_norm: 4.2096 loss: 0.6369 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6369
2023/05/27 14:57:48 - mmengine - INFO - Epoch(train) [27][ 60/940] lr: 1.0000e-03 eta: 3:44:10 time: 0.5958 data_time: 0.0153 memory: 15293 grad_norm: 4.3333 loss: 0.5936 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5936
2023/05/27 14:58:00 - mmengine - INFO - Epoch(train) [27][ 80/940] lr: 1.0000e-03 eta: 3:43:59 time: 0.6063 data_time: 0.0285 memory: 15293 grad_norm: 4.1895 loss: 0.7940 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7940
2023/05/27 14:58:12 - mmengine - INFO - Epoch(train) [27][100/940] lr: 1.0000e-03 eta: 3:43:46 time: 0.5916 data_time: 0.0080 memory: 15293 grad_norm: 4.4538 loss: 0.7169 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7169
2023/05/27 14:58:23 - mmengine - INFO - Epoch(train) [27][120/940] lr: 1.0000e-03 eta: 3:43:33 time: 0.5860 data_time: 0.0078 memory: 15293 grad_norm: 4.2905 loss: 0.5391 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5391
2023/05/27 14:58:35 - mmengine - INFO - Epoch(train) [27][140/940] lr: 1.0000e-03 eta: 3:43:21 time: 0.5910 data_time: 0.0120 memory: 15293 grad_norm: 4.5589 loss: 0.7233 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7233
2023/05/27 14:58:47 - mmengine - INFO - Epoch(train) [27][160/940] lr: 1.0000e-03 eta: 3:43:08 time: 0.5946 data_time: 0.0140 memory: 15293 grad_norm: 5.2712 loss: 0.7359 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.7359
2023/05/27 14:58:59 - mmengine - INFO - Epoch(train) [27][180/940] lr: 1.0000e-03 eta: 3:42:58 time: 0.6125 data_time: 0.0306 memory: 15293 grad_norm: 4.4561 loss: 0.5767 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5767
2023/05/27 14:59:11 - mmengine - INFO - Epoch(train) [27][200/940] lr: 1.0000e-03 eta: 3:42:45 time: 0.5872 data_time: 0.0077 memory: 15293 grad_norm: 4.4372 loss: 0.7666 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7666
2023/05/27 14:59:23 - mmengine - INFO - Epoch(train) [27][220/940] lr: 1.0000e-03 eta: 3:42:32 time: 0.5936 data_time: 0.0125 memory: 15293 grad_norm: 7.4818 loss: 0.5134 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5134
2023/05/27 14:59:35 - mmengine - INFO - Epoch(train) [27][240/940] lr: 1.0000e-03 eta: 3:42:20 time: 0.5921 data_time: 0.0117 memory: 15293 grad_norm: 4.3651 loss: 0.7754 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7754
2023/05/27 14:59:46 - mmengine - INFO - Epoch(train) [27][260/940] lr: 1.0000e-03 eta: 3:42:07 time: 0.5897 data_time: 0.0104 memory: 15293 grad_norm: 4.5830 loss: 0.5686 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5686
2023/05/27 14:59:59 - mmengine - INFO - Epoch(train) [27][280/940] lr: 1.0000e-03 eta: 3:41:57 time: 0.6208 data_time: 0.0427 memory: 15293 grad_norm: 4.6721 loss: 0.5654 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5654
2023/05/27 15:00:11 - mmengine - INFO - Epoch(train) [27][300/940] lr: 1.0000e-03 eta: 3:41:44 time: 0.5863 data_time: 0.0074 memory: 15293 grad_norm: 4.3789 loss: 0.6427 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6427
2023/05/27 15:00:22 - mmengine - INFO - Epoch(train) [27][320/940] lr: 1.0000e-03 eta: 3:41:32 time: 0.5875 data_time: 0.0074 memory: 15293 grad_norm: 4.5957 loss: 0.7746 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7746
2023/05/27 15:00:34 - mmengine - INFO - Epoch(train) [27][340/940] lr: 1.0000e-03 eta: 3:41:19 time: 0.5856 data_time: 0.0078 memory: 15293 grad_norm: 4.2660 loss: 0.5319 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5319
2023/05/27 15:00:46 - mmengine - INFO - Epoch(train) [27][360/940] lr: 1.0000e-03 eta: 3:41:06 time: 0.5868 data_time: 0.0076 memory: 15293 grad_norm: 4.3604 loss: 0.4701 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4701
2023/05/27 15:00:58 - mmengine - INFO - Epoch(train) [27][380/940] lr: 1.0000e-03 eta: 3:40:53 time: 0.5892 data_time: 0.0074 memory: 15293 grad_norm: 4.6564 loss: 0.8887 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.8887
2023/05/27 15:01:09 - mmengine - INFO - Epoch(train) [27][400/940] lr: 1.0000e-03 eta: 3:40:40 time: 0.5859 data_time: 0.0075 memory: 15293 grad_norm: 4.6677 loss: 0.7722 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7722
2023/05/27 15:01:21 - mmengine - INFO - Epoch(train) [27][420/940] lr: 1.0000e-03 eta: 3:40:28 time: 0.5934 data_time: 0.0079 memory: 15293 grad_norm: 4.7020 loss: 0.6648 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6648
2023/05/27 15:01:33 - mmengine - INFO - Epoch(train) [27][440/940] lr: 1.0000e-03 eta: 3:40:16 time: 0.6001 data_time: 0.0073 memory: 15293 grad_norm: 4.1830 loss: 0.7227 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7227
2023/05/27 15:01:45 - mmengine - INFO - Epoch(train) [27][460/940] lr: 1.0000e-03 eta: 3:40:04 time: 0.5944 data_time: 0.0076 memory: 15293 grad_norm: 4.4261 loss: 0.5390 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5390
2023/05/27 15:01:57 - mmengine - INFO - Epoch(train) [27][480/940] lr: 1.0000e-03 eta: 3:39:51 time: 0.5856 data_time: 0.0075 memory: 15293 grad_norm: 4.3706 loss: 0.4350 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4350
2023/05/27 15:02:09 - mmengine - INFO - Epoch(train) [27][500/940] lr: 1.0000e-03 eta: 3:39:39 time: 0.5939 data_time: 0.0075 memory: 15293 grad_norm: 4.3403 loss: 0.5902 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5902
2023/05/27 15:02:20 - mmengine - INFO - Epoch(train) [27][520/940] lr: 1.0000e-03 eta: 3:39:26 time: 0.5886 data_time: 0.0074 memory: 15293 grad_norm: 4.2493 loss: 0.6271 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6271
2023/05/27 15:02:32 - mmengine - INFO - Epoch(train) [27][540/940] lr: 1.0000e-03 eta: 3:39:13 time: 0.5863 data_time: 0.0077 memory: 15293 grad_norm: 4.8548 loss: 0.7654 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7654
2023/05/27 15:02:44 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:02:44 - mmengine - INFO - Epoch(train) [27][560/940] lr: 1.0000e-03 eta: 3:39:00 time: 0.5855 data_time: 0.0075 memory: 15293 grad_norm: 4.3217 loss: 0.6087 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6087
2023/05/27 15:02:56 - mmengine - INFO - Epoch(train) [27][580/940] lr: 1.0000e-03 eta: 3:38:48 time: 0.5997 data_time: 0.0074 memory: 15293 grad_norm: 4.5428 loss: 0.5596 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5596
2023/05/27 15:03:08 - mmengine - INFO - Epoch(train) [27][600/940] lr: 1.0000e-03 eta: 3:38:36 time: 0.5878 data_time: 0.0078 memory: 15293 grad_norm: 4.4020 loss: 0.7525 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7525
2023/05/27 15:03:19 - mmengine - INFO - Epoch(train) [27][620/940] lr: 1.0000e-03 eta: 3:38:23 time: 0.5908 data_time: 0.0076 memory: 15293 grad_norm: 4.6306 loss: 0.6856 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6856
2023/05/27 15:03:31 - mmengine - INFO - Epoch(train) [27][640/940] lr: 1.0000e-03 eta: 3:38:11 time: 0.5945 data_time: 0.0075 memory: 15293 grad_norm: 4.3305 loss: 0.5787 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5787
2023/05/27 15:03:43 - mmengine - INFO - Epoch(train) [27][660/940] lr: 1.0000e-03 eta: 3:37:59 time: 0.5950 data_time: 0.0077 memory: 15293 grad_norm: 4.3268 loss: 0.6463 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6463
2023/05/27 15:03:55 - mmengine - INFO - Epoch(train) [27][680/940] lr: 1.0000e-03 eta: 3:37:47 time: 0.5947 data_time: 0.0073 memory: 15293 grad_norm: 4.6622 loss: 0.6775 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6775
2023/05/27 15:04:07 - mmengine - INFO - Epoch(train) [27][700/940] lr: 1.0000e-03 eta: 3:37:34 time: 0.5885 data_time: 0.0076 memory: 15293 grad_norm: 5.2295 loss: 0.7219 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7219
2023/05/27 15:04:19 - mmengine - INFO - Epoch(train) [27][720/940] lr: 1.0000e-03 eta: 3:37:22 time: 0.5924 data_time: 0.0076 memory: 15293 grad_norm: 4.2010 loss: 0.7761 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7761
2023/05/27 15:04:31 - mmengine - INFO - Epoch(train) [27][740/940] lr: 1.0000e-03 eta: 3:37:10 time: 0.5962 data_time: 0.0075 memory: 15293 grad_norm: 4.3071 loss: 0.7332 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7332
2023/05/27 15:04:43 - mmengine - INFO - Epoch(train) [27][760/940] lr: 1.0000e-03 eta: 3:36:57 time: 0.5944 data_time: 0.0075 memory: 15293 grad_norm: 4.2621 loss: 0.6550 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6550
2023/05/27 15:04:55 - mmengine - INFO - Epoch(train) [27][780/940] lr: 1.0000e-03 eta: 3:36:46 time: 0.6040 data_time: 0.0077 memory: 15293 grad_norm: 4.3692 loss: 0.7923 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7923
2023/05/27 15:05:07 - mmengine - INFO - Epoch(train) [27][800/940] lr: 1.0000e-03 eta: 3:36:35 time: 0.6024 data_time: 0.0074 memory: 15293 grad_norm: 4.5022 loss: 0.6879 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6879
2023/05/27 15:05:19 - mmengine - INFO - Epoch(train) [27][820/940] lr: 1.0000e-03 eta: 3:36:23 time: 0.5967 data_time: 0.0076 memory: 15293 grad_norm: 4.1359 loss: 0.6030 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6030
2023/05/27 15:05:31 - mmengine - INFO - Epoch(train) [27][840/940] lr: 1.0000e-03 eta: 3:36:11 time: 0.5985 data_time: 0.0076 memory: 15293 grad_norm: 4.3046 loss: 0.6550 top1_acc: 0.3750 top5_acc: 0.8750 loss_cls: 0.6550
2023/05/27 15:05:43 - mmengine - INFO - Epoch(train) [27][860/940] lr: 1.0000e-03 eta: 3:35:59 time: 0.5970 data_time: 0.0078 memory: 15293 grad_norm: 4.5021 loss: 0.8129 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.8129
2023/05/27 15:05:54 - mmengine - INFO - Epoch(train) [27][880/940] lr: 1.0000e-03 eta: 3:35:47 time: 0.5940 data_time: 0.0074 memory: 15293 grad_norm: 4.7716 loss: 0.6979 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6979
2023/05/27 15:06:06 - mmengine - INFO - Epoch(train) [27][900/940] lr: 1.0000e-03 eta: 3:35:34 time: 0.5859 data_time: 0.0078 memory: 15293 grad_norm: 4.4526 loss: 0.7001 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7001
2023/05/27 15:06:18 - mmengine - INFO - Epoch(train) [27][920/940] lr: 1.0000e-03 eta: 3:35:21 time: 0.5918 data_time: 0.0075 memory: 15293 grad_norm: 4.0631 loss: 0.6148 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6148
2023/05/27 15:06:30 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:06:30 - mmengine - INFO - Epoch(train) [27][940/940] lr: 1.0000e-03 eta: 3:35:09 time: 0.5868 data_time: 0.0072 memory: 15293 grad_norm: 4.5961 loss: 0.5775 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5775
2023/05/27 15:06:30 - mmengine - INFO - Saving checkpoint at 27 epochs
2023/05/27 15:06:38 - mmengine - INFO - Epoch(val) [27][20/78] eta: 0:00:17 time: 0.3060 data_time: 0.1216 memory: 2851
2023/05/27 15:06:43 - mmengine - INFO - Epoch(val) [27][40/78] eta: 0:00:10 time: 0.2233 data_time: 0.0383 memory: 2851
2023/05/27 15:06:47 - mmengine - INFO - Epoch(val) [27][60/78] eta: 0:00:04 time: 0.2160 data_time: 0.0322 memory: 2851
2023/05/27 15:06:59 - mmengine - INFO - Epoch(val) [27][78/78] acc/top1: 0.7710 acc/top5: 0.9299 acc/mean1: 0.7709 data_time: 0.0496 time: 0.2310
2023/05/27 15:06:59 - mmengine - INFO - The previous best checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_26.pth is removed
2023/05/27 15:07:01 - mmengine - INFO - The best checkpoint with 0.7710 acc/top1 at 27 epoch is saved to best_acc_top1_epoch_27.pth.
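The per-iteration entries above follow a fixed key-value layout, so they can be summarized mechanically. Below is a minimal parsing sketch; the regex and the helper name `parse_train_entries` are illustrative, not part of mmengine itself:

```python
import re

# Matches mmengine-style training entries, one per line, e.g.:
# 2023/05/27 15:06:18 - mmengine - INFO - Epoch(train) [27][920/940] lr: ... loss: 0.6148 top1_acc: 1.0000 ...
TRAIN_RE = re.compile(
    r"Epoch\(train\)\s+\[(?P<epoch>\d+)\]\[\s*(?P<iter>\d+)/(?P<total>\d+)\]"
    r".*?\bloss:\s+(?P<loss>[\d.]+)\s+top1_acc:\s+(?P<top1>[\d.]+)"
)

def parse_train_entries(text):
    """Return (epoch, iteration, loss, top1_acc) tuples found in raw log text."""
    return [
        (int(m["epoch"]), int(m["iter"]), float(m["loss"]), float(m["top1"]))
        for m in TRAIN_RE.finditer(text)
    ]

sample = (
    "2023/05/27 15:06:18 - mmengine - INFO - Epoch(train) [27][920/940] "
    "lr: 1.0000e-03 eta: 3:35:21 time: 0.5918 data_time: 0.0075 memory: 15293 "
    "grad_norm: 4.0631 loss: 0.6148 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6148"
)
entries = parse_train_entries(sample)
```

Feeding the whole log file through `parse_train_entries` gives per-iteration series that can be averaged per epoch or plotted against the `acc/top1` values reported at each validation step.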
2023/05/27 15:07:14 - mmengine - INFO - Epoch(train) [28][ 20/940] lr: 1.0000e-03 eta: 3:35:02 time: 0.6667 data_time: 0.0837 memory: 15293 grad_norm: 4.7544 loss: 0.7211 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7211
2023/05/27 15:07:26 - mmengine - INFO - Epoch(train) [28][ 40/940] lr: 1.0000e-03 eta: 3:34:49 time: 0.5835 data_time: 0.0074 memory: 15293 grad_norm: 4.1772 loss: 0.7540 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7540
2023/05/27 15:07:38 - mmengine - INFO - Epoch(train) [28][ 60/940] lr: 1.0000e-03 eta: 3:34:37 time: 0.5948 data_time: 0.0075 memory: 15293 grad_norm: 4.4231 loss: 0.7963 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7963
2023/05/27 15:07:50 - mmengine - INFO - Epoch(train) [28][ 80/940] lr: 1.0000e-03 eta: 3:34:25 time: 0.5980 data_time: 0.0075 memory: 15293 grad_norm: 5.8331 loss: 0.6650 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6650
2023/05/27 15:08:02 - mmengine - INFO - Epoch(train) [28][100/940] lr: 1.0000e-03 eta: 3:34:12 time: 0.5862 data_time: 0.0078 memory: 15293 grad_norm: 4.7311 loss: 0.7612 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7612
2023/05/27 15:08:13 - mmengine - INFO - Epoch(train) [28][120/940] lr: 1.0000e-03 eta: 3:34:00 time: 0.5934 data_time: 0.0078 memory: 15293 grad_norm: 4.6160 loss: 0.6518 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6518
2023/05/27 15:08:25 - mmengine - INFO - Epoch(train) [28][140/940] lr: 1.0000e-03 eta: 3:33:48 time: 0.5936 data_time: 0.0076 memory: 15293 grad_norm: 4.4713 loss: 0.5638 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5638
2023/05/27 15:08:37 - mmengine - INFO - Epoch(train) [28][160/940] lr: 1.0000e-03 eta: 3:33:36 time: 0.5943 data_time: 0.0077 memory: 15293 grad_norm: 4.2273 loss: 0.6452 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6452
2023/05/27 15:08:49 - mmengine - INFO - Epoch(train) [28][180/940] lr: 1.0000e-03 eta: 3:33:23 time: 0.5880 data_time: 0.0077 memory: 15293 grad_norm: 4.2456 loss: 0.7515 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7515
2023/05/27 15:09:01 - mmengine - INFO - Epoch(train) [28][200/940] lr: 1.0000e-03 eta: 3:33:11 time: 0.5955 data_time: 0.0075 memory: 15293 grad_norm: 4.4070 loss: 0.6860 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6860
2023/05/27 15:09:13 - mmengine - INFO - Epoch(train) [28][220/940] lr: 1.0000e-03 eta: 3:32:59 time: 0.5914 data_time: 0.0077 memory: 15293 grad_norm: 4.3150 loss: 0.7793 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7793
2023/05/27 15:09:24 - mmengine - INFO - Epoch(train) [28][240/940] lr: 1.0000e-03 eta: 3:32:46 time: 0.5890 data_time: 0.0075 memory: 15293 grad_norm: 4.8830 loss: 0.6409 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6409
2023/05/27 15:09:36 - mmengine - INFO - Epoch(train) [28][260/940] lr: 1.0000e-03 eta: 3:32:34 time: 0.5953 data_time: 0.0076 memory: 15293 grad_norm: 4.5493 loss: 0.6646 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6646
2023/05/27 15:09:48 - mmengine - INFO - Epoch(train) [28][280/940] lr: 1.0000e-03 eta: 3:32:21 time: 0.5859 data_time: 0.0075 memory: 15293 grad_norm: 4.4357 loss: 0.6148 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6148
2023/05/27 15:10:00 - mmengine - INFO - Epoch(train) [28][300/940] lr: 1.0000e-03 eta: 3:32:09 time: 0.5891 data_time: 0.0081 memory: 15293 grad_norm: 4.7243 loss: 0.5993 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5993
2023/05/27 15:10:12 - mmengine - INFO - Epoch(train) [28][320/940] lr: 1.0000e-03 eta: 3:31:56 time: 0.5916 data_time: 0.0074 memory: 15293 grad_norm: 4.3517 loss: 0.5306 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5306
2023/05/27 15:10:23 - mmengine - INFO - Epoch(train) [28][340/940] lr: 1.0000e-03 eta: 3:31:44 time: 0.5862 data_time: 0.0077 memory: 15293 grad_norm: 4.1922 loss: 0.7156 top1_acc: 0.5000 top5_acc: 0.6250 loss_cls: 0.7156
2023/05/27 15:10:35 - mmengine - INFO - Epoch(train) [28][360/940] lr: 1.0000e-03 eta: 3:31:31 time: 0.5922 data_time: 0.0075 memory: 15293 grad_norm: 5.1801 loss: 0.7436 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7436
2023/05/27 15:10:47 - mmengine - INFO - Epoch(train) [28][380/940] lr: 1.0000e-03 eta: 3:31:19 time: 0.5894 data_time: 0.0075 memory: 15293 grad_norm: 4.2997 loss: 0.5796 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5796
2023/05/27 15:10:59 - mmengine - INFO - Epoch(train) [28][400/940] lr: 1.0000e-03 eta: 3:31:06 time: 0.5860 data_time: 0.0074 memory: 15293 grad_norm: 4.0569 loss: 0.6346 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6346
2023/05/27 15:11:11 - mmengine - INFO - Epoch(train) [28][420/940] lr: 1.0000e-03 eta: 3:30:53 time: 0.5859 data_time: 0.0075 memory: 15293 grad_norm: 4.1947 loss: 0.6561 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6561
2023/05/27 15:11:22 - mmengine - INFO - Epoch(train) [28][440/940] lr: 1.0000e-03 eta: 3:30:41 time: 0.5885 data_time: 0.0074 memory: 15293 grad_norm: 3.9810 loss: 0.6288 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6288
2023/05/27 15:11:34 - mmengine - INFO - Epoch(train) [28][460/940] lr: 1.0000e-03 eta: 3:30:29 time: 0.5939 data_time: 0.0075 memory: 15293 grad_norm: 4.2826 loss: 0.7170 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7170
2023/05/27 15:11:46 - mmengine - INFO - Epoch(train) [28][480/940] lr: 1.0000e-03 eta: 3:30:16 time: 0.5880 data_time: 0.0076 memory: 15293 grad_norm: 4.2810 loss: 0.5292 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5292
2023/05/27 15:11:58 - mmengine - INFO - Epoch(train) [28][500/940] lr: 1.0000e-03 eta: 3:30:04 time: 0.5908 data_time: 0.0076 memory: 15293 grad_norm: 4.2690 loss: 0.8286 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8286
2023/05/27 15:12:10 - mmengine - INFO - Epoch(train) [28][520/940] lr: 1.0000e-03 eta: 3:29:52 time: 0.5928 data_time: 0.0076 memory: 15293 grad_norm: 4.1359 loss: 0.5044 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5044
2023/05/27 15:12:21 - mmengine - INFO - Epoch(train) [28][540/940] lr: 1.0000e-03 eta: 3:29:39 time: 0.5861 data_time: 0.0076 memory: 15293 grad_norm: 4.4953 loss: 0.7423 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.7423
2023/05/27 15:12:33 - mmengine - INFO - Epoch(train) [28][560/940] lr: 1.0000e-03 eta: 3:29:26 time: 0.5879 data_time: 0.0075 memory: 15293 grad_norm: 4.1048 loss: 0.6818 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6818
2023/05/27 15:12:45 - mmengine - INFO - Epoch(train) [28][580/940] lr: 1.0000e-03 eta: 3:29:14 time: 0.5924 data_time: 0.0074 memory: 15293 grad_norm: 4.7557 loss: 0.6254 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6254
2023/05/27 15:12:57 - mmengine - INFO - Epoch(train) [28][600/940] lr: 1.0000e-03 eta: 3:29:02 time: 0.5859 data_time: 0.0076 memory: 15293 grad_norm: 4.5953 loss: 0.5815 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5815
2023/05/27 15:13:08 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:13:08 - mmengine - INFO - Epoch(train) [28][620/940] lr: 1.0000e-03 eta: 3:28:49 time: 0.5912 data_time: 0.0075 memory: 15293 grad_norm: 4.3114 loss: 0.8207 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8207
2023/05/27 15:13:20 - mmengine - INFO - Epoch(train) [28][640/940] lr: 1.0000e-03 eta: 3:28:37 time: 0.5901 data_time: 0.0077 memory: 15293 grad_norm: 4.7675 loss: 0.7277 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7277
2023/05/27 15:13:32 - mmengine - INFO - Epoch(train) [28][660/940] lr: 1.0000e-03 eta: 3:28:25 time: 0.5936 data_time: 0.0078 memory: 15293 grad_norm: 4.6411 loss: 0.5583 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5583
2023/05/27 15:13:44 - mmengine - INFO - Epoch(train) [28][680/940] lr: 1.0000e-03 eta: 3:28:13 time: 0.5955 data_time: 0.0077 memory: 15293 grad_norm: 4.7476 loss: 0.6001 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6001
2023/05/27 15:13:56 - mmengine - INFO - Epoch(train) [28][700/940] lr: 1.0000e-03 eta: 3:28:00 time: 0.5862 data_time: 0.0077 memory: 15293 grad_norm: 4.5994 loss: 0.6731 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6731
2023/05/27 15:14:08 - mmengine - INFO - Epoch(train) [28][720/940] lr: 1.0000e-03 eta: 3:27:48 time: 0.5897 data_time: 0.0076 memory: 15293 grad_norm: 4.8098 loss: 0.5967 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5967
2023/05/27 15:14:20 - mmengine - INFO - Epoch(train) [28][740/940] lr: 1.0000e-03 eta: 3:27:36 time: 0.5960 data_time: 0.0077 memory: 15293 grad_norm: 4.3983 loss: 0.6657 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6657
2023/05/27 15:14:31 - mmengine - INFO - Epoch(train) [28][760/940] lr: 1.0000e-03 eta: 3:27:23 time: 0.5896 data_time: 0.0074 memory: 15293 grad_norm: 5.4531 loss: 0.6441 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6441
2023/05/27 15:14:43 - mmengine - INFO - Epoch(train) [28][780/940] lr: 1.0000e-03 eta: 3:27:11 time: 0.5866 data_time: 0.0077 memory: 15293 grad_norm: 4.4828 loss: 0.8030 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8030
2023/05/27 15:14:55 - mmengine - INFO - Epoch(train) [28][800/940] lr: 1.0000e-03 eta: 3:26:59 time: 0.5955 data_time: 0.0074 memory: 15293 grad_norm: 4.1632 loss: 0.7531 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7531
2023/05/27 15:15:07 - mmengine - INFO - Epoch(train) [28][820/940] lr: 1.0000e-03 eta: 3:26:46 time: 0.5896 data_time: 0.0077 memory: 15293 grad_norm: 4.2919 loss: 0.5355 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5355
2023/05/27 15:15:19 - mmengine - INFO - Epoch(train) [28][840/940] lr: 1.0000e-03 eta: 3:26:35 time: 0.5964 data_time: 0.0074 memory: 15293 grad_norm: 4.1993 loss: 0.7715 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7715
2023/05/27 15:15:30 - mmengine - INFO - Epoch(train) [28][860/940] lr: 1.0000e-03 eta: 3:26:22 time: 0.5897 data_time: 0.0077 memory: 15293 grad_norm: 4.3442 loss: 0.6183 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6183
2023/05/27 15:15:42 - mmengine - INFO - Epoch(train) [28][880/940] lr: 1.0000e-03 eta: 3:26:10 time: 0.5946 data_time: 0.0080 memory: 15293 grad_norm: 4.3420 loss: 0.9236 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.9236
2023/05/27 15:15:54 - mmengine - INFO - Epoch(train) [28][900/940] lr: 1.0000e-03 eta: 3:25:58 time: 0.5905 data_time: 0.0077 memory: 15293 grad_norm: 4.5755 loss: 0.4807 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4807
2023/05/27 15:16:06 - mmengine - INFO - Epoch(train) [28][920/940] lr: 1.0000e-03 eta: 3:25:46 time: 0.5934 data_time: 0.0077 memory: 15293 grad_norm: 4.3778 loss: 0.7044 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7044
2023/05/27 15:16:18 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:16:18 - mmengine - INFO - Epoch(train) [28][940/940] lr: 1.0000e-03 eta: 3:25:32 time: 0.5737 data_time: 0.0074 memory: 15293 grad_norm: 4.7306 loss: 0.5715 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5715
2023/05/27 15:16:23 - mmengine - INFO - Epoch(val) [28][20/78] eta: 0:00:17 time: 0.2968 data_time: 0.1122 memory: 2851
2023/05/27 15:16:28 - mmengine - INFO - Epoch(val) [28][40/78] eta: 0:00:10 time: 0.2326 data_time: 0.0476 memory: 2851
2023/05/27 15:16:33 - mmengine - INFO - Epoch(val) [28][60/78] eta: 0:00:04 time: 0.2245 data_time: 0.0398 memory: 2851
2023/05/27 15:16:38 - mmengine - INFO - Epoch(val) [28][78/78] acc/top1: 0.7711 acc/top5: 0.9293 acc/mean1: 0.7709 data_time: 0.0516 time: 0.2333
2023/05/27 15:16:38 - mmengine - INFO - The previous best checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_27.pth is removed
2023/05/27 15:16:40 - mmengine - INFO - The best checkpoint with 0.7711 acc/top1 at 28 epoch is saved to best_acc_top1_epoch_28.pth.
2023/05/27 15:16:53 - mmengine - INFO - Epoch(train) [29][ 20/940] lr: 1.0000e-03 eta: 3:25:24 time: 0.6559 data_time: 0.0730 memory: 15293 grad_norm: 4.2069 loss: 0.6992 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6992
2023/05/27 15:17:05 - mmengine - INFO - Epoch(train) [29][ 40/940] lr: 1.0000e-03 eta: 3:25:12 time: 0.5889 data_time: 0.0072 memory: 15293 grad_norm: 5.7202 loss: 0.5669 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5669
2023/05/27 15:17:16 - mmengine - INFO - Epoch(train) [29][ 60/940] lr: 1.0000e-03 eta: 3:25:00 time: 0.5920 data_time: 0.0074 memory: 15293 grad_norm: 4.8067 loss: 0.5451 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5451
2023/05/27 15:17:28 - mmengine - INFO - Epoch(train) [29][ 80/940] lr: 1.0000e-03 eta: 3:24:47 time: 0.5923 data_time: 0.0076 memory: 15293 grad_norm: 4.1571 loss: 0.6621 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6621
2023/05/27 15:17:40 - mmengine - INFO - Epoch(train) [29][100/940] lr: 1.0000e-03 eta: 3:24:35 time: 0.5908 data_time: 0.0075 memory: 15293 grad_norm: 4.2300 loss: 0.7750 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7750
2023/05/27 15:17:52 - mmengine - INFO - Epoch(train) [29][120/940] lr: 1.0000e-03 eta: 3:24:23 time: 0.5894 data_time: 0.0073 memory: 15293 grad_norm: 4.6122 loss: 0.5992 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5992
2023/05/27 15:18:04 - mmengine - INFO - Epoch(train) [29][140/940] lr: 1.0000e-03 eta: 3:24:11 time: 0.5926 data_time: 0.0075 memory: 15293 grad_norm: 4.3053 loss: 0.5689 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5689
2023/05/27 15:18:16 - mmengine - INFO - Epoch(train) [29][160/940] lr: 1.0000e-03 eta: 3:23:58 time: 0.5910 data_time: 0.0073 memory: 15293 grad_norm: 4.3597 loss: 0.4224 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4224
2023/05/27 15:18:27 - mmengine - INFO - Epoch(train) [29][180/940] lr: 1.0000e-03 eta: 3:23:46 time: 0.5862 data_time: 0.0075 memory: 15293 grad_norm: 4.5002 loss: 0.7705 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.7705
2023/05/27 15:18:39 - mmengine - INFO - Epoch(train) [29][200/940] lr: 1.0000e-03 eta: 3:23:34 time: 0.5968 data_time: 0.0075 memory: 15293 grad_norm: 4.4522 loss: 0.6511 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6511
2023/05/27 15:18:51 - mmengine - INFO - Epoch(train) [29][220/940] lr: 1.0000e-03 eta: 3:23:21 time: 0.5859 data_time: 0.0075 memory: 15293 grad_norm: 4.3892 loss: 0.5909 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5909
2023/05/27 15:19:03 - mmengine - INFO - Epoch(train) [29][240/940] lr: 1.0000e-03 eta: 3:23:09 time: 0.5884 data_time: 0.0075 memory: 15293 grad_norm: 6.1020 loss: 0.6453 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6453
2023/05/27 15:19:14 - mmengine - INFO - Epoch(train) [29][260/940] lr: 1.0000e-03 eta: 3:22:56 time: 0.5878 data_time: 0.0077 memory: 15293 grad_norm: 4.3228 loss: 0.9303 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.9303
2023/05/27 15:19:26 - mmengine - INFO - Epoch(train) [29][280/940] lr: 1.0000e-03 eta: 3:22:44 time: 0.5879 data_time: 0.0073 memory: 15293 grad_norm: 4.2916 loss: 0.6131 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6131
2023/05/27 15:19:38 - mmengine - INFO - Epoch(train) [29][300/940] lr: 1.0000e-03 eta: 3:22:32 time: 0.5933 data_time: 0.0075 memory: 15293 grad_norm: 4.2276 loss: 0.7054 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7054
2023/05/27 15:19:50 - mmengine - INFO - Epoch(train) [29][320/940] lr: 1.0000e-03 eta: 3:22:20 time: 0.5938 data_time: 0.0072 memory: 15293 grad_norm: 4.8591 loss: 0.4209 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4209
2023/05/27 15:20:02 - mmengine - INFO - Epoch(train) [29][340/940] lr: 1.0000e-03 eta: 3:22:08 time: 0.5959 data_time: 0.0075 memory: 15293 grad_norm: 4.2812 loss: 0.6588 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6588
2023/05/27 15:20:14 - mmengine - INFO - Epoch(train) [29][360/940] lr: 1.0000e-03 eta: 3:21:56 time: 0.5940 data_time: 0.0074 memory: 15293 grad_norm: 4.3436 loss: 0.5416 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5416
2023/05/27 15:20:26 - mmengine - INFO - Epoch(train) [29][380/940] lr: 1.0000e-03 eta: 3:21:44 time: 0.5918 data_time: 0.0077 memory: 15293 grad_norm: 5.8666 loss: 0.4790 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4790
2023/05/27 15:20:37 - mmengine - INFO - Epoch(train) [29][400/940] lr: 1.0000e-03 eta: 3:21:31 time: 0.5858 data_time: 0.0075 memory: 15293 grad_norm: 4.7881 loss: 0.8405 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8405
2023/05/27 15:20:49 - mmengine - INFO - Epoch(train) [29][420/940] lr: 1.0000e-03 eta: 3:21:19 time: 0.5885 data_time: 0.0077 memory: 15293 grad_norm: 4.2089 loss: 0.7383 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7383
2023/05/27 15:21:01 - mmengine - INFO - Epoch(train) [29][440/940] lr: 1.0000e-03 eta: 3:21:07 time: 0.5991 data_time: 0.0073 memory: 15293 grad_norm: 4.4580 loss: 0.5947 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5947
2023/05/27 15:21:13 - mmengine - INFO - Epoch(train) [29][460/940] lr: 1.0000e-03 eta: 3:20:55 time: 0.5917 data_time: 0.0076 memory: 15293 grad_norm: 4.9382 loss: 0.7659 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7659
2023/05/27 15:21:25 - mmengine - INFO - Epoch(train) [29][480/940] lr: 1.0000e-03 eta: 3:20:43 time: 0.5972 data_time: 0.0073 memory: 15293 grad_norm: 4.4973 loss: 0.6454 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6454
2023/05/27 15:21:37 - mmengine - INFO - Epoch(train) [29][500/940] lr: 1.0000e-03 eta: 3:20:30 time: 0.5872 data_time: 0.0075 memory: 15293 grad_norm: 4.0767 loss: 0.6687 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6687
2023/05/27 15:21:48 - mmengine - INFO - Epoch(train) [29][520/940] lr: 1.0000e-03 eta: 3:20:18 time: 0.5899 data_time: 0.0074 memory: 15293 grad_norm: 4.3089 loss: 0.6193 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6193
2023/05/27 15:22:00 - mmengine - INFO - Epoch(train) [29][540/940] lr: 1.0000e-03 eta: 3:20:06 time: 0.5893 data_time: 0.0076 memory: 15293 grad_norm: 4.1898 loss: 0.6668 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6668
2023/05/27 15:22:12 - mmengine - INFO - Epoch(train) [29][560/940] lr: 1.0000e-03 eta: 3:19:54 time: 0.5972 data_time: 0.0075 memory: 15293 grad_norm: 4.5917 loss: 0.7179 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.7179
2023/05/27 15:22:24 - mmengine - INFO - Epoch(train) [29][580/940] lr: 1.0000e-03 eta: 3:19:42 time: 0.5864 data_time: 0.0076 memory: 15293 grad_norm: 4.1874 loss: 0.7788 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7788
2023/05/27 15:22:36 - mmengine - INFO - Epoch(train) [29][600/940] lr: 1.0000e-03 eta: 3:19:29 time: 0.5921 data_time: 0.0077 memory: 15293 grad_norm: 4.1629 loss: 0.7374 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7374
2023/05/27 15:22:47 - mmengine - INFO - Epoch(train) [29][620/940] lr: 1.0000e-03 eta: 3:19:17 time: 0.5877 data_time: 0.0078 memory: 15293 grad_norm: 4.2766 loss: 0.6514 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6514
2023/05/27 15:22:59 - mmengine - INFO - Epoch(train) [29][640/940] lr: 1.0000e-03 eta: 3:19:05 time: 0.5899 data_time: 0.0077 memory: 15293 grad_norm: 4.3742 loss: 0.5277 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5277
2023/05/27 15:23:11 - mmengine - INFO - Epoch(train) [29][660/940] lr: 1.0000e-03 eta: 3:18:52 time: 0.5879 data_time: 0.0076 memory: 15293 grad_norm: 4.5154 loss: 0.5712 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5712
2023/05/27 15:23:23 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:23:23 - mmengine - INFO - Epoch(train) [29][680/940] lr: 1.0000e-03 eta: 3:18:40 time: 0.5919 data_time: 0.0075 memory: 15293 grad_norm: 4.6569 loss: 0.7135 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7135
2023/05/27 15:23:35 - mmengine - INFO - Epoch(train) [29][700/940] lr: 1.0000e-03 eta: 3:18:28 time: 0.5929 data_time: 0.0076 memory: 15293 grad_norm: 4.4040 loss: 0.5533 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5533
2023/05/27 15:23:47 - mmengine - INFO - Epoch(train) [29][720/940] lr: 1.0000e-03 eta: 3:18:16 time: 0.5970 data_time: 0.0076 memory: 15293 grad_norm: 4.6018 loss: 0.5578 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5578
2023/05/27 15:23:59 - mmengine - INFO - Epoch(train) [29][740/940] lr: 1.0000e-03 eta: 3:18:04 time: 0.5937 data_time: 0.0076 memory: 15293 grad_norm: 4.5273 loss: 0.6974 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6974
2023/05/27 15:24:10 - mmengine - INFO - Epoch(train) [29][760/940] lr: 1.0000e-03 eta: 3:17:52 time: 0.5859 data_time: 0.0074 memory: 15293 grad_norm: 4.7418 loss: 0.6198 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6198
2023/05/27 15:24:22 - mmengine - INFO - Epoch(train) [29][780/940] lr: 1.0000e-03 eta: 3:17:40 time: 0.5925 data_time: 0.0079 memory: 15293 grad_norm: 4.5503 loss: 0.5526 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5526
2023/05/27 15:24:34 - mmengine - INFO - Epoch(train) [29][800/940] lr: 1.0000e-03 eta: 3:17:27 time: 0.5903 data_time: 0.0076 memory: 15293 grad_norm: 4.4362 loss: 0.5494 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5494
2023/05/27 15:24:46 - mmengine - INFO - Epoch(train) [29][820/940] lr: 1.0000e-03 eta: 3:17:15 time: 0.5959 data_time: 0.0075 memory: 15293 grad_norm: 4.9744 loss: 0.6055 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6055
2023/05/27 15:24:58 - mmengine - INFO - Epoch(train) [29][840/940] lr: 1.0000e-03 eta: 3:17:04 time: 0.5981 data_time: 0.0077 memory: 15293 grad_norm: 4.3766 loss: 0.6836 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6836
2023/05/27 15:25:10 - mmengine - INFO - Epoch(train) [29][860/940] lr: 1.0000e-03 eta: 3:16:51 time: 0.5871 data_time: 0.0076 memory: 15293 grad_norm: 4.9727 loss: 0.4871 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4871
2023/05/27 15:25:21 - mmengine - INFO - Epoch(train) [29][880/940] lr: 1.0000e-03 eta: 3:16:39 time: 0.5898 data_time: 0.0074 memory: 15293 grad_norm: 4.2609 loss: 0.5662 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5662
2023/05/27 15:25:33 - mmengine - INFO - Epoch(train) [29][900/940] lr: 1.0000e-03 eta: 3:16:27 time: 0.5904 data_time: 0.0076 memory: 15293 grad_norm: 5.1903 loss: 0.6928 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6928
2023/05/27 15:25:45 - mmengine - INFO - Epoch(train) [29][920/940] lr: 1.0000e-03 eta: 3:16:15 time: 0.5980 data_time: 0.0074 memory: 15293 grad_norm: 4.6714 loss: 0.4932 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.4932
2023/05/27 15:25:57 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:25:57 - mmengine - INFO - Epoch(train) [29][940/940] lr: 1.0000e-03 eta: 3:16:02 time: 0.5718 data_time: 0.0071 memory: 15293 grad_norm: 6.4361 loss: 0.6330 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6330
2023/05/27 15:26:03 - mmengine - INFO - Epoch(val) [29][20/78] eta: 0:00:18 time: 0.3184 data_time: 0.1334 memory: 2851
2023/05/27 15:26:07 - mmengine - INFO - Epoch(val) [29][40/78] eta: 0:00:10 time: 0.2175 data_time: 0.0319 memory: 2851
2023/05/27 15:26:12 - mmengine - INFO - Epoch(val) [29][60/78] eta: 0:00:04 time: 0.2311 data_time: 0.0461 memory: 2851
2023/05/27 15:27:29 - mmengine - INFO - Epoch(val) [29][78/78] acc/top1: 0.7693 acc/top5: 0.9283 acc/mean1: 0.7692 data_time: 0.0545 time: 0.2366
2023/05/27 15:27:43 - mmengine - INFO - Epoch(train) [30][ 20/940] lr: 1.0000e-03 eta: 3:15:55 time: 0.6863 data_time: 0.0621 memory: 15293 grad_norm: 4.2981 loss: 0.6052 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6052
2023/05/27 15:27:55 - mmengine - INFO - Epoch(train) [30][ 40/940] lr: 1.0000e-03 eta: 3:15:42 time: 0.5838 data_time: 0.0075 memory: 15293 grad_norm: 4.6226 loss: 0.7316 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.7316
2023/05/27 15:28:07 - mmengine - INFO - Epoch(train) [30][ 60/940] lr: 1.0000e-03 eta: 3:15:30 time: 0.6005 data_time: 0.0074 memory: 15293 grad_norm: 4.7610 loss: 0.6550 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6550
2023/05/27 15:28:18 - mmengine - INFO - Epoch(train) [30][ 80/940] lr: 1.0000e-03 eta: 3:15:18 time: 0.5872 data_time: 0.0075 memory: 15293 grad_norm: 4.5341 loss: 0.7997 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7997
2023/05/27 15:28:30 - mmengine - INFO - Epoch(train) [30][100/940] lr: 1.0000e-03 eta: 3:15:06 time: 0.5910 data_time: 0.0078 memory: 15293 grad_norm: 4.4356 loss: 0.7373 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.7373
2023/05/27 15:28:42 - mmengine - INFO - Epoch(train) [30][120/940] lr: 1.0000e-03 eta: 3:14:53 time: 0.5849 data_time: 0.0073 memory: 15293 grad_norm: 4.7226 loss: 0.7329 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7329
2023/05/27 15:28:54 - mmengine - INFO - Epoch(train) [30][140/940] lr: 1.0000e-03 eta: 3:14:41 time: 0.5856 data_time: 0.0077 memory: 15293 grad_norm: 4.0878 loss: 0.6325 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6325
2023/05/27 15:29:05 - mmengine - INFO - Epoch(train) [30][160/940] lr: 1.0000e-03 eta: 3:14:28 time: 0.5850 data_time: 0.0074 memory: 15293 grad_norm: 4.4330 loss: 0.6892 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6892
2023/05/27 15:29:17 - mmengine - INFO - Epoch(train) [30][180/940] lr: 1.0000e-03 eta: 3:14:16 time: 0.5889 data_time: 0.0076 memory: 15293 grad_norm: 4.6245 loss: 0.7170 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7170
2023/05/27 15:29:29 - mmengine - INFO - Epoch(train) [30][200/940] lr: 1.0000e-03 eta: 3:14:04 time: 0.5895 data_time: 0.0076 memory: 15293 grad_norm: 5.3487 loss: 0.5700 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5700
2023/05/27 15:29:41 - mmengine - INFO - Epoch(train) [30][220/940] lr: 1.0000e-03 eta: 3:13:52 time: 0.5942 data_time: 0.0075 memory: 15293 grad_norm: 4.7045 loss: 0.5928 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.5928
2023/05/27 15:29:52 - mmengine - INFO - Epoch(train) [30][240/940] lr: 1.0000e-03 eta: 3:13:40 time: 0.5892 data_time: 0.0076 memory: 15293 grad_norm: 4.4446 loss: 0.6904 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6904
2023/05/27 15:30:04 - mmengine - INFO - Epoch(train) [30][260/940] lr: 1.0000e-03 eta: 3:13:28 time: 0.5924 data_time: 0.0077 memory: 15293 grad_norm: 4.4559 loss: 0.5725 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5725
2023/05/27 15:30:16 - mmengine - INFO - Epoch(train) [30][280/940] lr: 1.0000e-03 eta: 3:13:15 time: 0.5900 data_time: 0.0076 memory: 15293 grad_norm: 4.4715 loss: 0.5301 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5301
2023/05/27 15:30:28 - mmengine - INFO - Epoch(train) [30][300/940] lr: 1.0000e-03 eta: 3:13:03 time: 0.5870 data_time: 0.0078 memory: 15293 grad_norm: 4.8897 loss: 0.5798 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5798
2023/05/27 15:30:40 - mmengine - INFO - Epoch(train) [30][320/940] lr: 1.0000e-03 eta: 3:12:51 time: 0.5930 data_time: 0.0076 memory: 15293 grad_norm: 5.0053 loss: 0.6005 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6005
2023/05/27 15:30:52 - mmengine - INFO - Epoch(train) [30][340/940] lr: 1.0000e-03 eta: 3:12:39 time: 0.5922 data_time: 0.0075 memory: 15293 grad_norm: 4.6151 loss: 0.8119 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8119
2023/05/27 15:31:03 - mmengine - INFO - Epoch(train) [30][360/940] lr: 1.0000e-03 eta: 3:12:26 time: 0.5869 data_time: 0.0077 memory: 15293 grad_norm: 4.7694 loss: 0.5501 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5501
2023/05/27 15:31:15 - mmengine - INFO - Epoch(train) [30][380/940] lr: 1.0000e-03 eta: 3:12:14 time: 0.5923 data_time: 0.0076 memory: 15293 grad_norm: 5.8491 loss: 0.7061 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7061
2023/05/27 15:31:27 - mmengine - INFO - Epoch(train) [30][400/940] lr: 1.0000e-03 eta: 3:12:02 time: 0.5880 data_time: 0.0078 memory: 15293 grad_norm: 4.7394 loss: 0.4736 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4736
2023/05/27 15:31:39 - mmengine - INFO - Epoch(train) [30][420/940] lr: 1.0000e-03 eta: 3:11:50 time: 0.5905 data_time: 0.0077 memory: 15293 grad_norm: 4.5663 loss: 0.5879 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5879
2023/05/27 15:31:51 - mmengine - INFO - Epoch(train) [30][440/940] lr: 1.0000e-03 eta: 3:11:38 time: 0.5909 data_time: 0.0074 memory: 15293 grad_norm: 4.3716 loss: 0.6176 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6176
2023/05/27 15:32:02 - mmengine - INFO - Epoch(train) [30][460/940] lr: 1.0000e-03 eta: 3:11:25 time: 0.5886 data_time: 0.0074 memory: 15293 grad_norm: 4.3089 loss: 0.6458 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.6458
2023/05/27 15:32:14 - mmengine - INFO - Epoch(train) [30][480/940] lr: 1.0000e-03 eta: 3:11:13 time: 0.5931 data_time: 0.0075 memory: 15293 grad_norm: 4.5381 loss: 0.5712 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5712
2023/05/27 15:32:26 - mmengine - INFO - Epoch(train) [30][500/940] lr: 1.0000e-03 eta: 3:11:01 time: 0.5868 data_time: 0.0075 memory: 15293 grad_norm: 4.7868 loss: 0.5574 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5574
2023/05/27 15:32:38 - mmengine - INFO - Epoch(train) [30][520/940] lr: 1.0000e-03 eta: 3:10:49 time: 0.5861 data_time: 0.0074 memory: 15293 grad_norm: 4.7641 loss: 0.4958 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.4958
2023/05/27 15:32:49 - mmengine - INFO - Epoch(train) [30][540/940] lr: 1.0000e-03 eta: 3:10:36 time: 0.5871 data_time: 0.0076 memory: 15293 grad_norm: 4.6803 loss: 0.4989 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4989
2023/05/27 15:33:01 - mmengine - INFO - Epoch(train) [30][560/940] lr: 1.0000e-03 eta: 3:10:25 time: 0.5970 data_time: 0.0076 memory: 15293 grad_norm: 4.2559 loss: 0.7654 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7654
2023/05/27 15:33:13 - mmengine - INFO - Epoch(train) [30][580/940] lr: 1.0000e-03 eta: 3:10:12 time: 0.5859 data_time: 0.0077 memory: 15293 grad_norm: 4.9586 loss: 0.4837 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4837
2023/05/27 15:33:25 - mmengine - INFO - Epoch(train) [30][600/940] lr: 1.0000e-03 eta: 3:10:00 time: 0.5992 data_time: 0.0078 memory: 15293 grad_norm: 4.5430 loss: 0.8181 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.8181
2023/05/27 15:33:37 - mmengine - INFO - Epoch(train) [30][620/940] lr: 1.0000e-03 eta: 3:09:48 time: 0.5912 data_time: 0.0076 memory: 15293 grad_norm: 4.8661 loss: 0.7684 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7684
2023/05/27 15:33:49 - mmengine - INFO - Epoch(train) [30][640/940] lr: 1.0000e-03 eta: 3:09:36 time: 0.5866 data_time: 0.0075 memory: 15293 grad_norm: 4.1798 loss: 0.6092 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6092
2023/05/27 15:34:00 - mmengine - INFO - Epoch(train) [30][660/940] lr: 1.0000e-03 eta: 3:09:24 time: 0.5885 data_time: 0.0077 memory: 15293 grad_norm: 4.5657 loss: 0.6618 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6618
2023/05/27 15:34:12 - mmengine - INFO - Epoch(train) [30][680/940] lr: 1.0000e-03 eta: 3:09:12 time: 0.5929 data_time: 0.0075 memory: 15293 grad_norm: 4.4564 loss: 0.6922 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6922
2023/05/27 15:34:24 - mmengine - INFO - Epoch(train) [30][700/940] lr: 1.0000e-03 eta: 3:09:00 time: 0.5958 data_time: 0.0080 memory: 15293 grad_norm: 4.5247 loss: 0.9196 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.9196
2023/05/27 15:34:36 - mmengine - INFO - Epoch(train) [30][720/940] lr: 1.0000e-03 eta: 3:08:48 time: 0.5970 data_time: 0.0077 memory: 15293 grad_norm: 4.4186 loss: 0.6480 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6480
2023/05/27 15:34:48 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:34:48 - mmengine - INFO - Epoch(train) [30][740/940] lr: 1.0000e-03 eta: 3:08:36 time: 0.5965 data_time: 0.0074 memory: 15293 grad_norm: 4.6840 loss: 0.7006 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7006
2023/05/27 15:35:00 - mmengine - INFO - Epoch(train) [30][760/940] lr: 1.0000e-03 eta: 3:08:24 time: 0.5990 data_time: 0.0078 memory: 15293 grad_norm: 4.4876 loss: 0.7152 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7152
2023/05/27 15:35:12 - mmengine - INFO - Epoch(train) [30][780/940] lr: 1.0000e-03 eta: 3:08:12 time: 0.5860 data_time: 0.0076 memory: 15293 grad_norm: 4.3251 loss: 0.6488 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6488
2023/05/27 15:35:23 - mmengine - INFO - Epoch(train) [30][800/940] lr: 1.0000e-03 eta: 3:08:00 time: 0.5895 data_time: 0.0074 memory: 15293 grad_norm: 4.3655 loss: 0.6210 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6210
2023/05/27 15:35:35 - mmengine - INFO - Epoch(train) [30][820/940] lr: 1.0000e-03 eta: 3:07:48 time: 0.5927 data_time: 0.0074 memory: 15293 grad_norm: 5.4017 loss: 0.7989 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7989
2023/05/27 15:35:47 - mmengine - INFO - Epoch(train) [30][840/940] lr: 1.0000e-03 eta: 3:07:36 time: 0.5905 data_time: 0.0075 memory: 15293 grad_norm: 4.2192 loss: 0.5860 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5860
2023/05/27 15:35:59 - mmengine - INFO - Epoch(train) [30][860/940] lr: 1.0000e-03 eta: 3:07:23 time: 0.5898 data_time: 0.0078 memory: 15293 grad_norm: 4.4409 loss: 0.6468 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6468
2023/05/27 15:36:11 - mmengine - INFO - Epoch(train) [30][880/940] lr: 1.0000e-03 eta: 3:07:11 time: 0.5950 data_time: 0.0075 memory: 15293 grad_norm: 4.7451 loss: 0.6983 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6983
2023/05/27 15:36:23 - mmengine - INFO - Epoch(train) [30][900/940] lr: 1.0000e-03 eta: 3:06:59 time: 0.5857 data_time: 0.0076 memory: 15293 grad_norm: 4.7741 loss: 0.7637 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7637
2023/05/27 15:36:34 - mmengine - INFO - Epoch(train) [30][920/940] lr: 1.0000e-03 eta: 3:06:47 time: 0.5947 data_time: 0.0075 memory: 15293 grad_norm: 4.6537 loss: 0.6086 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6086
2023/05/27 15:36:46 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:36:46 - mmengine - INFO - Epoch(train) [30][940/940] lr: 1.0000e-03 eta: 3:06:35 time: 0.5868 data_time: 0.0074 memory: 15293 grad_norm: 4.5086 loss: 0.5875 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5875
2023/05/27 15:36:46 - mmengine - INFO - Saving checkpoint at 30 epochs
2023/05/27 15:36:56 - mmengine - INFO - Epoch(val) [30][20/78] eta: 0:00:17 time: 0.3062 data_time: 0.1219 memory: 2851
2023/05/27 15:37:00 - mmengine - INFO - Epoch(val) [30][40/78] eta: 0:00:10 time: 0.2220 data_time: 0.0370 memory: 2851
2023/05/27 15:37:05 - mmengine - INFO - Epoch(val) [30][60/78] eta: 0:00:04 time: 0.2288 data_time: 0.0448 memory: 2851
2023/05/27 15:37:34 - mmengine - INFO - Epoch(val) [30][78/78] acc/top1: 0.7706 acc/top5: 0.9292 acc/mean1: 0.7705 data_time: 0.0526 time: 0.2340
2023/05/27 15:37:48 - mmengine - INFO - Epoch(train) [31][ 20/940] lr: 1.0000e-03 eta: 3:06:28 time: 0.7050 data_time: 0.0815 memory: 15293 grad_norm: 4.5530 loss: 0.3635 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.3635
2023/05/27 15:38:00 - mmengine - INFO - Epoch(train) [31][ 40/940] lr: 1.0000e-03 eta: 3:06:16 time: 0.5897 data_time: 0.0076 memory: 15293 grad_norm: 4.4490 loss: 0.5194 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5194
2023/05/27 15:38:12 - mmengine - INFO - Epoch(train) [31][ 60/940] lr: 1.0000e-03 eta: 3:06:03 time: 0.5876 data_time: 0.0075 memory: 15293 grad_norm: 4.9437 loss: 0.5136 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5136
2023/05/27 15:38:24 - mmengine - INFO - Epoch(train) [31][ 80/940] lr: 1.0000e-03 eta: 3:05:51 time: 0.5966 data_time: 0.0076 memory: 15293 grad_norm: 4.7734 loss: 0.7111 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7111
2023/05/27 15:38:35 - mmengine - INFO - Epoch(train) [31][100/940] lr: 1.0000e-03 eta: 3:05:39 time: 0.5865 data_time: 0.0076 memory: 15293 grad_norm: 4.7739 loss: 0.5713 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5713
2023/05/27 15:38:47 - mmengine - INFO - Epoch(train) [31][120/940] lr: 1.0000e-03 eta: 3:05:27 time: 0.5861 data_time: 0.0075 memory: 15293 grad_norm: 4.6001 loss: 0.6781 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6781
2023/05/27 15:38:59 - mmengine - INFO - Epoch(train) [31][140/940] lr: 1.0000e-03 eta: 3:05:15 time: 0.5968 data_time: 0.0078 memory: 15293 grad_norm: 4.7504 loss: 0.5948 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5948
2023/05/27 15:39:11 - mmengine - INFO - Epoch(train) [31][160/940] lr: 1.0000e-03 eta: 3:05:03 time: 0.5918 data_time: 0.0075 memory: 15293 grad_norm: 4.8125 loss: 0.3803 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.3803
2023/05/27 15:39:23 - mmengine - INFO - Epoch(train) [31][180/940] lr: 1.0000e-03 eta: 3:04:51 time: 0.5950 data_time: 0.0081 memory: 15293 grad_norm: 4.7777 loss: 0.7589 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7589
2023/05/27 15:39:34 - mmengine - INFO - Epoch(train) [31][200/940] lr: 1.0000e-03 eta: 3:04:39 time: 0.5859 data_time: 0.0075 memory: 15293 grad_norm: 4.9448 loss: 0.6207 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6207
2023/05/27 15:39:46 - mmengine - INFO - Epoch(train) [31][220/940] lr: 1.0000e-03 eta: 3:04:26 time: 0.5869 data_time: 0.0077 memory: 15293 grad_norm: 4.5520 loss: 0.7215 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7215
2023/05/27 15:39:58 - mmengine - INFO - Epoch(train) [31][240/940] lr: 1.0000e-03 eta: 3:04:14 time: 0.5911 data_time: 0.0077 memory: 15293 grad_norm: 4.7882 loss: 0.6441 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6441
2023/05/27 15:40:10 - mmengine - INFO - Epoch(train) [31][260/940] lr: 1.0000e-03 eta: 3:04:02 time: 0.5967 data_time: 0.0076 memory: 15293 grad_norm: 4.9940 loss: 0.6757 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6757
2023/05/27 15:40:22 - mmengine - INFO - Epoch(train) [31][280/940] lr: 1.0000e-03 eta: 3:03:50 time: 0.5939 data_time: 0.0076 memory: 15293 grad_norm: 4.2666 loss: 0.6854 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6854
2023/05/27 15:40:34 - mmengine - INFO - Epoch(train) [31][300/940] lr: 1.0000e-03 eta: 3:03:38 time: 0.5907 data_time: 0.0074 memory: 15293 grad_norm: 4.5922 loss: 0.5480 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5480
2023/05/27 15:40:45 - mmengine - INFO - Epoch(train) [31][320/940] lr: 1.0000e-03 eta: 3:03:26 time: 0.5895 data_time: 0.0075 memory: 15293 grad_norm: 4.2820 loss: 0.5747 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5747
2023/05/27 15:40:57 - mmengine - INFO - Epoch(train) [31][340/940] lr: 1.0000e-03 eta: 3:03:14 time: 0.5877 data_time: 0.0075 memory: 15293 grad_norm: 4.5837 loss: 0.6039 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6039
2023/05/27 15:41:09 - mmengine - INFO - Epoch(train) [31][360/940] lr: 1.0000e-03 eta: 3:03:02 time: 0.5882 data_time: 0.0076 memory: 15293 grad_norm: 4.9429 loss: 0.6336 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6336
2023/05/27 15:41:21 - mmengine - INFO - Epoch(train) [31][380/940] lr: 1.0000e-03 eta: 3:02:49 time: 0.5893 data_time: 0.0075 memory: 15293 grad_norm: 4.5646 loss: 0.6057 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6057
2023/05/27 15:41:32 - mmengine - INFO - Epoch(train) [31][400/940] lr: 1.0000e-03 eta: 3:02:37 time: 0.5895 data_time: 0.0078 memory: 15293 grad_norm: 4.5118 loss: 0.5872 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5872
2023/05/27 15:41:44 - mmengine - INFO - Epoch(train) [31][420/940] lr: 1.0000e-03 eta: 3:02:25 time: 0.5912 data_time: 0.0077 memory: 15293 grad_norm: 5.0591 loss: 0.5749 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5749
2023/05/27 15:41:56 - mmengine - INFO - Epoch(train) [31][440/940] lr: 1.0000e-03 eta: 3:02:13 time: 0.5883 data_time: 0.0075 memory: 15293 grad_norm: 4.8538 loss: 0.8503 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8503
2023/05/27 15:42:08 - mmengine - INFO - Epoch(train) [31][460/940] lr: 1.0000e-03 eta: 3:02:01 time: 0.5860 data_time: 0.0076 memory: 15293 grad_norm: 4.3647 loss: 0.7096 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7096
2023/05/27 15:42:20 - mmengine - INFO - Epoch(train) [31][480/940] lr: 1.0000e-03 eta: 3:01:49 time: 0.5958 data_time: 0.0075 memory: 15293 grad_norm: 5.0570 loss: 0.6742 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6742
2023/05/27 15:42:31 - mmengine - INFO - Epoch(train) [31][500/940] lr: 1.0000e-03 eta: 3:01:36 time: 0.5856 data_time: 0.0075 memory: 15293 grad_norm: 4.5950 loss: 0.8561 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8561
2023/05/27 15:42:43 - mmengine - INFO - Epoch(train) [31][520/940] lr: 1.0000e-03 eta: 3:01:24 time: 0.5924 data_time: 0.0077 memory: 15293 grad_norm: 4.5979 loss: 0.7092 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7092
2023/05/27 15:42:55 - mmengine - INFO - Epoch(train) [31][540/940] lr: 1.0000e-03 eta: 3:01:12 time: 0.5896 data_time: 0.0075 memory: 15293 grad_norm: 4.8915 loss: 0.4768 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4768
2023/05/27 15:43:07 - mmengine - INFO - Epoch(train) [31][560/940] lr: 1.0000e-03 eta: 3:01:00 time: 0.5899 data_time: 0.0075 memory: 15293 grad_norm: 4.3207 loss: 0.3853 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.3853
2023/05/27 15:43:19 - mmengine - INFO - Epoch(train) [31][580/940] lr: 1.0000e-03 eta: 3:00:48 time: 0.5855 data_time: 0.0075 memory: 15293 grad_norm: 4.7067 loss: 0.5007 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5007
2023/05/27 15:43:30 - mmengine - INFO - Epoch(train) [31][600/940] lr: 1.0000e-03 eta: 3:00:36 time: 0.5948 data_time: 0.0075 memory: 15293 grad_norm: 4.4515 loss: 0.7595 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7595
2023/05/27 15:43:42 - mmengine - INFO - Epoch(train) [31][620/940] lr: 1.0000e-03 eta: 3:00:24 time: 0.5881 data_time: 0.0079 memory: 15293 grad_norm: 4.3316 loss: 0.4820 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4820
2023/05/27 15:43:54 - mmengine - INFO - Epoch(train) [31][640/940] lr: 1.0000e-03 eta: 3:00:12 time: 0.5889 data_time: 0.0075 memory: 15293 grad_norm: 4.2710 loss: 0.7351 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7351
2023/05/27 15:44:06 - mmengine - INFO - Epoch(train) [31][660/940] lr: 1.0000e-03 eta: 3:00:00 time: 0.5979 data_time: 0.0076 memory: 15293 grad_norm: 4.3872 loss: 0.8580 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8580
2023/05/27 15:44:18 - mmengine - INFO - Epoch(train) [31][680/940] lr: 1.0000e-03 eta: 2:59:48 time: 0.5966 data_time: 0.0074 memory: 15293 grad_norm: 4.3589 loss: 0.5933 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5933
2023/05/27 15:44:30 - mmengine - INFO - Epoch(train) [31][700/940] lr: 1.0000e-03 eta: 2:59:36 time: 0.5895 data_time: 0.0077 memory: 15293 grad_norm: 4.5504 loss: 0.5208 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5208
2023/05/27 15:44:42 - mmengine - INFO - Epoch(train) [31][720/940] lr: 1.0000e-03 eta: 2:59:24 time: 0.5949 data_time: 0.0073 memory: 15293 grad_norm: 5.3280 loss: 0.7139 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7139
2023/05/27 15:44:53 - mmengine - INFO - Epoch(train) [31][740/940] lr: 1.0000e-03 eta: 2:59:12 time: 0.5869 data_time: 0.0082 memory: 15293 grad_norm: 4.2715 loss: 0.5754 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5754
2023/05/27 15:45:05 - mmengine - INFO - Epoch(train) [31][760/940] lr: 1.0000e-03 eta: 2:58:59 time: 0.5871 data_time: 0.0077 memory: 15293 grad_norm: 4.5250 loss: 0.5341 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5341
2023/05/27 15:45:17 - mmengine - INFO - Epoch(train) [31][780/940] lr: 1.0000e-03 eta: 2:58:47 time: 0.5888 data_time: 0.0076 memory: 15293 grad_norm: 4.5185 loss: 0.6937 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6937
2023/05/27 15:45:29 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 15:45:29 - mmengine - INFO - Epoch(train) [31][800/940] lr: 1.0000e-03 eta: 2:58:35 time: 0.5930 data_time: 0.0075 memory: 15293 grad_norm: 4.5498 loss: 0.6705 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6705
2023/05/27 15:45:40 - mmengine - INFO - Epoch(train) [31][820/940] lr: 1.0000e-03 eta: 2:58:23 time: 0.5907 data_time: 0.0075 memory: 15293 grad_norm: 4.5557 loss: 0.5615 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5615
2023/05/27 15:45:52 - mmengine - INFO - Epoch(train) [31][840/940] lr: 1.0000e-03 eta: 2:58:11 time: 0.5862 data_time: 0.0076 memory: 15293 grad_norm: 4.2198 loss: 0.7079 top1_acc: 0.7500 top5_acc: 0.8750
loss_cls: 0.7079 2023/05/27 15:46:04 - mmengine - INFO - Epoch(train) [31][860/940] lr: 1.0000e-03 eta: 2:57:59 time: 0.5865 data_time: 0.0072 memory: 15293 grad_norm: 4.5269 loss: 0.6718 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6718 2023/05/27 15:46:16 - mmengine - INFO - Epoch(train) [31][880/940] lr: 1.0000e-03 eta: 2:57:47 time: 0.5933 data_time: 0.0074 memory: 15293 grad_norm: 4.3138 loss: 0.6059 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6059 2023/05/27 15:46:28 - mmengine - INFO - Epoch(train) [31][900/940] lr: 1.0000e-03 eta: 2:57:35 time: 0.5904 data_time: 0.0078 memory: 15293 grad_norm: 4.2475 loss: 0.5831 top1_acc: 0.3750 top5_acc: 0.8750 loss_cls: 0.5831 2023/05/27 15:46:39 - mmengine - INFO - Epoch(train) [31][920/940] lr: 1.0000e-03 eta: 2:57:22 time: 0.5904 data_time: 0.0076 memory: 15293 grad_norm: 4.2346 loss: 0.8410 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.8410 2023/05/27 15:46:51 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 15:46:51 - mmengine - INFO - Epoch(train) [31][940/940] lr: 1.0000e-03 eta: 2:57:10 time: 0.5691 data_time: 0.0075 memory: 15293 grad_norm: 5.0109 loss: 0.7907 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.7907 2023/05/27 15:46:57 - mmengine - INFO - Epoch(val) [31][20/78] eta: 0:00:17 time: 0.2953 data_time: 0.1104 memory: 2851 2023/05/27 15:47:01 - mmengine - INFO - Epoch(val) [31][40/78] eta: 0:00:09 time: 0.2218 data_time: 0.0368 memory: 2851 2023/05/27 15:47:06 - mmengine - INFO - Epoch(val) [31][60/78] eta: 0:00:04 time: 0.2318 data_time: 0.0467 memory: 2851 2023/05/27 15:47:11 - mmengine - INFO - Epoch(val) [31][78/78] acc/top1: 0.7704 acc/top5: 0.9283 acc/mean1: 0.7703 data_time: 0.0501 time: 0.2321 2023/05/27 15:47:25 - mmengine - INFO - Epoch(train) [32][ 20/940] lr: 1.0000e-03 eta: 2:57:01 time: 0.6860 data_time: 0.0859 memory: 15293 grad_norm: 4.8148 loss: 0.7046 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7046 
2023/05/27 15:47:37 - mmengine - INFO - Epoch(train) [32][ 40/940] lr: 1.0000e-03 eta: 2:56:49 time: 0.5875 data_time: 0.0075 memory: 15293 grad_norm: 4.4405 loss: 0.5692 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5692 2023/05/27 15:47:49 - mmengine - INFO - Epoch(train) [32][ 60/940] lr: 1.0000e-03 eta: 2:56:37 time: 0.5972 data_time: 0.0078 memory: 15293 grad_norm: 4.5084 loss: 0.6194 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6194 2023/05/27 15:48:00 - mmengine - INFO - Epoch(train) [32][ 80/940] lr: 1.0000e-03 eta: 2:56:25 time: 0.5870 data_time: 0.0076 memory: 15293 grad_norm: 4.3582 loss: 0.6657 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6657 2023/05/27 15:48:12 - mmengine - INFO - Epoch(train) [32][100/940] lr: 1.0000e-03 eta: 2:56:13 time: 0.5907 data_time: 0.0077 memory: 15293 grad_norm: 4.5522 loss: 0.6443 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6443 2023/05/27 15:48:24 - mmengine - INFO - Epoch(train) [32][120/940] lr: 1.0000e-03 eta: 2:56:01 time: 0.5937 data_time: 0.0075 memory: 15293 grad_norm: 4.9771 loss: 0.7498 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7498 2023/05/27 15:48:36 - mmengine - INFO - Epoch(train) [32][140/940] lr: 1.0000e-03 eta: 2:55:49 time: 0.5943 data_time: 0.0075 memory: 15293 grad_norm: 6.4912 loss: 0.6373 top1_acc: 0.3750 top5_acc: 0.8750 loss_cls: 0.6373 2023/05/27 15:48:48 - mmengine - INFO - Epoch(train) [32][160/940] lr: 1.0000e-03 eta: 2:55:37 time: 0.5887 data_time: 0.0077 memory: 15293 grad_norm: 4.5474 loss: 0.4935 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4935 2023/05/27 15:49:00 - mmengine - INFO - Epoch(train) [32][180/940] lr: 1.0000e-03 eta: 2:55:25 time: 0.5945 data_time: 0.0076 memory: 15293 grad_norm: 5.5667 loss: 0.7158 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7158 2023/05/27 15:49:11 - mmengine - INFO - Epoch(train) [32][200/940] lr: 1.0000e-03 eta: 2:55:13 time: 0.5956 data_time: 0.0076 memory: 15293 grad_norm: 4.3625 loss: 0.7416 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.7416 
2023/05/27 15:49:23 - mmengine - INFO - Epoch(train) [32][220/940] lr: 1.0000e-03 eta: 2:55:01 time: 0.5980 data_time: 0.0079 memory: 15293 grad_norm: 4.8068 loss: 0.7178 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7178 2023/05/27 15:49:35 - mmengine - INFO - Epoch(train) [32][240/940] lr: 1.0000e-03 eta: 2:54:49 time: 0.5907 data_time: 0.0077 memory: 15293 grad_norm: 4.5565 loss: 0.7037 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7037 2023/05/27 15:49:47 - mmengine - INFO - Epoch(train) [32][260/940] lr: 1.0000e-03 eta: 2:54:37 time: 0.5916 data_time: 0.0079 memory: 15293 grad_norm: 5.2492 loss: 0.6158 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6158 2023/05/27 15:49:59 - mmengine - INFO - Epoch(train) [32][280/940] lr: 1.0000e-03 eta: 2:54:25 time: 0.5910 data_time: 0.0077 memory: 15293 grad_norm: 4.0388 loss: 0.5504 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5504 2023/05/27 15:50:11 - mmengine - INFO - Epoch(train) [32][300/940] lr: 1.0000e-03 eta: 2:54:13 time: 0.5870 data_time: 0.0076 memory: 15293 grad_norm: 4.9070 loss: 0.6197 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6197 2023/05/27 15:50:22 - mmengine - INFO - Epoch(train) [32][320/940] lr: 1.0000e-03 eta: 2:54:00 time: 0.5875 data_time: 0.0076 memory: 15293 grad_norm: 4.8623 loss: 0.7322 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.7322 2023/05/27 15:50:34 - mmengine - INFO - Epoch(train) [32][340/940] lr: 1.0000e-03 eta: 2:53:49 time: 0.5971 data_time: 0.0077 memory: 15293 grad_norm: 4.6158 loss: 0.5713 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5713 2023/05/27 15:50:46 - mmengine - INFO - Epoch(train) [32][360/940] lr: 1.0000e-03 eta: 2:53:36 time: 0.5877 data_time: 0.0076 memory: 15293 grad_norm: 5.2961 loss: 0.7967 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7967 2023/05/27 15:50:58 - mmengine - INFO - Epoch(train) [32][380/940] lr: 1.0000e-03 eta: 2:53:24 time: 0.5874 data_time: 0.0077 memory: 15293 grad_norm: 4.6227 loss: 0.6260 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6260 
2023/05/27 15:51:10 - mmengine - INFO - Epoch(train) [32][400/940] lr: 1.0000e-03 eta: 2:53:12 time: 0.5926 data_time: 0.0074 memory: 15293 grad_norm: 4.8680 loss: 0.5727 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5727 2023/05/27 15:51:21 - mmengine - INFO - Epoch(train) [32][420/940] lr: 1.0000e-03 eta: 2:53:00 time: 0.5907 data_time: 0.0077 memory: 15293 grad_norm: 4.8660 loss: 0.7044 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7044 2023/05/27 15:51:33 - mmengine - INFO - Epoch(train) [32][440/940] lr: 1.0000e-03 eta: 2:52:48 time: 0.5899 data_time: 0.0074 memory: 15293 grad_norm: 4.5294 loss: 0.5095 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5095 2023/05/27 15:51:45 - mmengine - INFO - Epoch(train) [32][460/940] lr: 1.0000e-03 eta: 2:52:36 time: 0.5866 data_time: 0.0074 memory: 15293 grad_norm: 4.3843 loss: 0.6074 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6074 2023/05/27 15:51:57 - mmengine - INFO - Epoch(train) [32][480/940] lr: 1.0000e-03 eta: 2:52:24 time: 0.5891 data_time: 0.0076 memory: 15293 grad_norm: 4.7834 loss: 0.5655 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5655 2023/05/27 15:52:09 - mmengine - INFO - Epoch(train) [32][500/940] lr: 1.0000e-03 eta: 2:52:12 time: 0.5935 data_time: 0.0077 memory: 15293 grad_norm: 4.9728 loss: 0.8413 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8413 2023/05/27 15:52:20 - mmengine - INFO - Epoch(train) [32][520/940] lr: 1.0000e-03 eta: 2:52:00 time: 0.5902 data_time: 0.0074 memory: 15293 grad_norm: 4.4137 loss: 0.6446 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6446 2023/05/27 15:52:32 - mmengine - INFO - Epoch(train) [32][540/940] lr: 1.0000e-03 eta: 2:51:48 time: 0.5928 data_time: 0.0075 memory: 15293 grad_norm: 4.7269 loss: 0.5452 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5452 2023/05/27 15:52:44 - mmengine - INFO - Epoch(train) [32][560/940] lr: 1.0000e-03 eta: 2:51:36 time: 0.5883 data_time: 0.0075 memory: 15293 grad_norm: 4.4501 loss: 0.7502 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7502 
2023/05/27 15:52:56 - mmengine - INFO - Epoch(train) [32][580/940] lr: 1.0000e-03 eta: 2:51:24 time: 0.5916 data_time: 0.0076 memory: 15293 grad_norm: 4.3843 loss: 0.6418 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6418 2023/05/27 15:53:08 - mmengine - INFO - Epoch(train) [32][600/940] lr: 1.0000e-03 eta: 2:51:12 time: 0.5996 data_time: 0.0073 memory: 15293 grad_norm: 5.2660 loss: 0.9704 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.9704 2023/05/27 15:53:20 - mmengine - INFO - Epoch(train) [32][620/940] lr: 1.0000e-03 eta: 2:51:00 time: 0.5947 data_time: 0.0075 memory: 15293 grad_norm: 4.4771 loss: 0.5876 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5876 2023/05/27 15:53:32 - mmengine - INFO - Epoch(train) [32][640/940] lr: 1.0000e-03 eta: 2:50:48 time: 0.5874 data_time: 0.0075 memory: 15293 grad_norm: 4.6510 loss: 0.5253 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5253 2023/05/27 15:53:43 - mmengine - INFO - Epoch(train) [32][660/940] lr: 1.0000e-03 eta: 2:50:36 time: 0.5871 data_time: 0.0077 memory: 15293 grad_norm: 5.0127 loss: 0.6387 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6387 2023/05/27 15:53:55 - mmengine - INFO - Epoch(train) [32][680/940] lr: 1.0000e-03 eta: 2:50:24 time: 0.5944 data_time: 0.0075 memory: 15293 grad_norm: 4.8501 loss: 0.7353 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7353 2023/05/27 15:54:07 - mmengine - INFO - Epoch(train) [32][700/940] lr: 1.0000e-03 eta: 2:50:12 time: 0.5939 data_time: 0.0076 memory: 15293 grad_norm: 5.1109 loss: 0.5975 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5975 2023/05/27 15:54:19 - mmengine - INFO - Epoch(train) [32][720/940] lr: 1.0000e-03 eta: 2:50:00 time: 0.5959 data_time: 0.0075 memory: 15293 grad_norm: 4.5231 loss: 0.6975 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6975 2023/05/27 15:54:31 - mmengine - INFO - Epoch(train) [32][740/940] lr: 1.0000e-03 eta: 2:49:48 time: 0.5957 data_time: 0.0074 memory: 15293 grad_norm: 6.1906 loss: 0.7476 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7476 
2023/05/27 15:54:43 - mmengine - INFO - Epoch(train) [32][760/940] lr: 1.0000e-03 eta: 2:49:36 time: 0.5936 data_time: 0.0075 memory: 15293 grad_norm: 5.2261 loss: 0.4751 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4751 2023/05/27 15:54:55 - mmengine - INFO - Epoch(train) [32][780/940] lr: 1.0000e-03 eta: 2:49:24 time: 0.5898 data_time: 0.0079 memory: 15293 grad_norm: 4.5549 loss: 0.6505 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6505 2023/05/27 15:55:06 - mmengine - INFO - Epoch(train) [32][800/940] lr: 1.0000e-03 eta: 2:49:12 time: 0.5871 data_time: 0.0073 memory: 15293 grad_norm: 4.5895 loss: 0.7181 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7181 2023/05/27 15:55:18 - mmengine - INFO - Epoch(train) [32][820/940] lr: 1.0000e-03 eta: 2:49:00 time: 0.5945 data_time: 0.0076 memory: 15293 grad_norm: 4.6160 loss: 0.5996 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5996 2023/05/27 15:55:30 - mmengine - INFO - Epoch(train) [32][840/940] lr: 1.0000e-03 eta: 2:48:48 time: 0.6015 data_time: 0.0075 memory: 15293 grad_norm: 5.3700 loss: 0.7421 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7421 2023/05/27 15:55:42 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 15:55:42 - mmengine - INFO - Epoch(train) [32][860/940] lr: 1.0000e-03 eta: 2:48:36 time: 0.5918 data_time: 0.0074 memory: 15293 grad_norm: 4.3700 loss: 0.6314 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6314 2023/05/27 15:55:54 - mmengine - INFO - Epoch(train) [32][880/940] lr: 1.0000e-03 eta: 2:48:24 time: 0.5888 data_time: 0.0076 memory: 15293 grad_norm: 4.6946 loss: 0.6282 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6282 2023/05/27 15:56:06 - mmengine - INFO - Epoch(train) [32][900/940] lr: 1.0000e-03 eta: 2:48:12 time: 0.5926 data_time: 0.0076 memory: 15293 grad_norm: 4.1927 loss: 0.6659 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6659 2023/05/27 15:56:18 - mmengine - INFO - Epoch(train) [32][920/940] lr: 1.0000e-03 
eta: 2:48:00 time: 0.5906 data_time: 0.0074 memory: 15293 grad_norm: 4.3868 loss: 0.6492 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6492 2023/05/27 15:56:29 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 15:56:29 - mmengine - INFO - Epoch(train) [32][940/940] lr: 1.0000e-03 eta: 2:47:47 time: 0.5720 data_time: 0.0075 memory: 15293 grad_norm: 4.7344 loss: 0.6892 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6892 2023/05/27 15:56:35 - mmengine - INFO - Epoch(val) [32][20/78] eta: 0:00:17 time: 0.2992 data_time: 0.1142 memory: 2851 2023/05/27 15:56:39 - mmengine - INFO - Epoch(val) [32][40/78] eta: 0:00:09 time: 0.2123 data_time: 0.0275 memory: 2851 2023/05/27 15:56:44 - mmengine - INFO - Epoch(val) [32][60/78] eta: 0:00:04 time: 0.2333 data_time: 0.0485 memory: 2851 2023/05/27 15:56:52 - mmengine - INFO - Epoch(val) [32][78/78] acc/top1: 0.7707 acc/top5: 0.9297 acc/mean1: 0.7705 data_time: 0.0493 time: 0.2311 2023/05/27 15:57:06 - mmengine - INFO - Epoch(train) [33][ 20/940] lr: 1.0000e-03 eta: 2:47:38 time: 0.6817 data_time: 0.0724 memory: 15293 grad_norm: 4.3684 loss: 0.6834 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6834 2023/05/27 15:57:18 - mmengine - INFO - Epoch(train) [33][ 40/940] lr: 1.0000e-03 eta: 2:47:26 time: 0.5893 data_time: 0.0072 memory: 15293 grad_norm: 4.7660 loss: 0.4913 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4913 2023/05/27 15:57:30 - mmengine - INFO - Epoch(train) [33][ 60/940] lr: 1.0000e-03 eta: 2:47:14 time: 0.5899 data_time: 0.0075 memory: 15293 grad_norm: 4.4326 loss: 0.6650 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6650 2023/05/27 15:57:41 - mmengine - INFO - Epoch(train) [33][ 80/940] lr: 1.0000e-03 eta: 2:47:02 time: 0.5860 data_time: 0.0078 memory: 15293 grad_norm: 4.6108 loss: 0.5497 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5497 2023/05/27 15:57:53 - mmengine - INFO - Epoch(train) [33][100/940] lr: 1.0000e-03 eta: 2:46:50 time: 
0.5914 data_time: 0.0074 memory: 15293 grad_norm: 5.1774 loss: 0.6006 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6006 2023/05/27 15:58:05 - mmengine - INFO - Epoch(train) [33][120/940] lr: 1.0000e-03 eta: 2:46:38 time: 0.5883 data_time: 0.0073 memory: 15293 grad_norm: 4.9575 loss: 0.6668 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6668 2023/05/27 15:58:17 - mmengine - INFO - Epoch(train) [33][140/940] lr: 1.0000e-03 eta: 2:46:26 time: 0.5882 data_time: 0.0075 memory: 15293 grad_norm: 4.5678 loss: 0.7324 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7324 2023/05/27 15:58:29 - mmengine - INFO - Epoch(train) [33][160/940] lr: 1.0000e-03 eta: 2:46:14 time: 0.5924 data_time: 0.0075 memory: 15293 grad_norm: 5.2502 loss: 0.8953 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8953 2023/05/27 15:58:40 - mmengine - INFO - Epoch(train) [33][180/940] lr: 1.0000e-03 eta: 2:46:01 time: 0.5855 data_time: 0.0076 memory: 15293 grad_norm: 5.0545 loss: 0.6858 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6858 2023/05/27 15:58:52 - mmengine - INFO - Epoch(train) [33][200/940] lr: 1.0000e-03 eta: 2:45:49 time: 0.5888 data_time: 0.0078 memory: 15293 grad_norm: 5.2167 loss: 0.4483 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.4483 2023/05/27 15:59:04 - mmengine - INFO - Epoch(train) [33][220/940] lr: 1.0000e-03 eta: 2:45:37 time: 0.5958 data_time: 0.0075 memory: 15293 grad_norm: 4.8035 loss: 0.5905 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5905 2023/05/27 15:59:16 - mmengine - INFO - Epoch(train) [33][240/940] lr: 1.0000e-03 eta: 2:45:26 time: 0.5995 data_time: 0.0074 memory: 15293 grad_norm: 4.9769 loss: 0.5681 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.5681 2023/05/27 15:59:28 - mmengine - INFO - Epoch(train) [33][260/940] lr: 1.0000e-03 eta: 2:45:14 time: 0.5946 data_time: 0.0076 memory: 15293 grad_norm: 4.6326 loss: 0.5722 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5722 2023/05/27 15:59:40 - mmengine - INFO - Epoch(train) [33][280/940] lr: 1.0000e-03 eta: 2:45:02 time: 
0.5950 data_time: 0.0077 memory: 15293 grad_norm: 5.0850 loss: 0.6673 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6673 2023/05/27 15:59:52 - mmengine - INFO - Epoch(train) [33][300/940] lr: 1.0000e-03 eta: 2:44:50 time: 0.5892 data_time: 0.0076 memory: 15293 grad_norm: 4.4798 loss: 0.8300 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.8300 2023/05/27 16:00:03 - mmengine - INFO - Epoch(train) [33][320/940] lr: 1.0000e-03 eta: 2:44:38 time: 0.5894 data_time: 0.0075 memory: 15293 grad_norm: 4.7208 loss: 0.6777 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6777 2023/05/27 16:00:15 - mmengine - INFO - Epoch(train) [33][340/940] lr: 1.0000e-03 eta: 2:44:25 time: 0.5860 data_time: 0.0078 memory: 15293 grad_norm: 5.3144 loss: 0.7176 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7176 2023/05/27 16:00:27 - mmengine - INFO - Epoch(train) [33][360/940] lr: 1.0000e-03 eta: 2:44:14 time: 0.5966 data_time: 0.0076 memory: 15293 grad_norm: 4.2647 loss: 0.6474 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6474 2023/05/27 16:00:39 - mmengine - INFO - Epoch(train) [33][380/940] lr: 1.0000e-03 eta: 2:44:01 time: 0.5882 data_time: 0.0073 memory: 15293 grad_norm: 4.5514 loss: 0.6551 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6551 2023/05/27 16:00:51 - mmengine - INFO - Epoch(train) [33][400/940] lr: 1.0000e-03 eta: 2:43:49 time: 0.5904 data_time: 0.0075 memory: 15293 grad_norm: 4.5828 loss: 0.8029 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8029 2023/05/27 16:01:02 - mmengine - INFO - Epoch(train) [33][420/940] lr: 1.0000e-03 eta: 2:43:37 time: 0.5861 data_time: 0.0075 memory: 15293 grad_norm: 4.6283 loss: 0.7078 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7078 2023/05/27 16:01:14 - mmengine - INFO - Epoch(train) [33][440/940] lr: 1.0000e-03 eta: 2:43:25 time: 0.5902 data_time: 0.0075 memory: 15293 grad_norm: 5.1042 loss: 0.5398 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5398 2023/05/27 16:01:26 - mmengine - INFO - Epoch(train) [33][460/940] lr: 1.0000e-03 eta: 2:43:13 time: 
0.5865 data_time: 0.0075 memory: 15293 grad_norm: 4.2400 loss: 0.6684 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6684 2023/05/27 16:01:38 - mmengine - INFO - Epoch(train) [33][480/940] lr: 1.0000e-03 eta: 2:43:01 time: 0.5923 data_time: 0.0075 memory: 15293 grad_norm: 4.7370 loss: 0.7169 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7169 2023/05/27 16:01:49 - mmengine - INFO - Epoch(train) [33][500/940] lr: 1.0000e-03 eta: 2:42:49 time: 0.5902 data_time: 0.0074 memory: 15293 grad_norm: 4.9825 loss: 0.5615 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5615 2023/05/27 16:02:01 - mmengine - INFO - Epoch(train) [33][520/940] lr: 1.0000e-03 eta: 2:42:37 time: 0.5882 data_time: 0.0078 memory: 15293 grad_norm: 4.4457 loss: 0.7206 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7206 2023/05/27 16:02:13 - mmengine - INFO - Epoch(train) [33][540/940] lr: 1.0000e-03 eta: 2:42:25 time: 0.5869 data_time: 0.0076 memory: 15293 grad_norm: 4.3279 loss: 0.7342 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7342 2023/05/27 16:02:25 - mmengine - INFO - Epoch(train) [33][560/940] lr: 1.0000e-03 eta: 2:42:13 time: 0.5996 data_time: 0.0074 memory: 15293 grad_norm: 4.4088 loss: 0.4635 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4635 2023/05/27 16:02:37 - mmengine - INFO - Epoch(train) [33][580/940] lr: 1.0000e-03 eta: 2:42:01 time: 0.5934 data_time: 0.0077 memory: 15293 grad_norm: 4.3805 loss: 0.5383 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5383 2023/05/27 16:02:49 - mmengine - INFO - Epoch(train) [33][600/940] lr: 1.0000e-03 eta: 2:41:49 time: 0.5905 data_time: 0.0076 memory: 15293 grad_norm: 4.5648 loss: 0.6407 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6407 2023/05/27 16:03:00 - mmengine - INFO - Epoch(train) [33][620/940] lr: 1.0000e-03 eta: 2:41:37 time: 0.5883 data_time: 0.0076 memory: 15293 grad_norm: 4.4131 loss: 0.8266 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.8266 2023/05/27 16:03:12 - mmengine - INFO - Epoch(train) [33][640/940] lr: 1.0000e-03 eta: 2:41:25 time: 
0.5931 data_time: 0.0076 memory: 15293 grad_norm: 4.0824 loss: 0.6533 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6533 2023/05/27 16:03:24 - mmengine - INFO - Epoch(train) [33][660/940] lr: 1.0000e-03 eta: 2:41:13 time: 0.5956 data_time: 0.0075 memory: 15293 grad_norm: 4.1533 loss: 0.4898 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4898 2023/05/27 16:03:36 - mmengine - INFO - Epoch(train) [33][680/940] lr: 1.0000e-03 eta: 2:41:01 time: 0.5961 data_time: 0.0075 memory: 15293 grad_norm: 4.4351 loss: 0.7796 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7796 2023/05/27 16:03:48 - mmengine - INFO - Epoch(train) [33][700/940] lr: 1.0000e-03 eta: 2:40:49 time: 0.5865 data_time: 0.0075 memory: 15293 grad_norm: 4.8061 loss: 0.7918 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.7918 2023/05/27 16:04:00 - mmengine - INFO - Epoch(train) [33][720/940] lr: 1.0000e-03 eta: 2:40:37 time: 0.5899 data_time: 0.0075 memory: 15293 grad_norm: 4.5663 loss: 0.4813 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4813 2023/05/27 16:04:12 - mmengine - INFO - Epoch(train) [33][740/940] lr: 1.0000e-03 eta: 2:40:25 time: 0.6013 data_time: 0.0074 memory: 15293 grad_norm: 4.7641 loss: 0.6270 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6270 2023/05/27 16:04:23 - mmengine - INFO - Epoch(train) [33][760/940] lr: 1.0000e-03 eta: 2:40:13 time: 0.5868 data_time: 0.0075 memory: 15293 grad_norm: 4.9333 loss: 0.8544 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.8544 2023/05/27 16:04:35 - mmengine - INFO - Epoch(train) [33][780/940] lr: 1.0000e-03 eta: 2:40:01 time: 0.5923 data_time: 0.0073 memory: 15293 grad_norm: 5.1924 loss: 0.4463 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4463 2023/05/27 16:04:47 - mmengine - INFO - Epoch(train) [33][800/940] lr: 1.0000e-03 eta: 2:39:50 time: 0.6039 data_time: 0.0075 memory: 15293 grad_norm: 4.7752 loss: 0.7057 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7057 2023/05/27 16:04:59 - mmengine - INFO - Epoch(train) [33][820/940] lr: 1.0000e-03 eta: 2:39:37 time: 
0.5860 data_time: 0.0076 memory: 15293 grad_norm: 4.6783 loss: 0.5955 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5955 2023/05/27 16:05:11 - mmengine - INFO - Epoch(train) [33][840/940] lr: 1.0000e-03 eta: 2:39:25 time: 0.5867 data_time: 0.0079 memory: 15293 grad_norm: 4.9302 loss: 0.6250 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6250 2023/05/27 16:05:23 - mmengine - INFO - Epoch(train) [33][860/940] lr: 1.0000e-03 eta: 2:39:13 time: 0.5917 data_time: 0.0077 memory: 15293 grad_norm: 4.4782 loss: 0.4734 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4734 2023/05/27 16:05:34 - mmengine - INFO - Epoch(train) [33][880/940] lr: 1.0000e-03 eta: 2:39:01 time: 0.5850 data_time: 0.0078 memory: 15293 grad_norm: 5.1325 loss: 0.6025 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6025 2023/05/27 16:05:46 - mmengine - INFO - Epoch(train) [33][900/940] lr: 1.0000e-03 eta: 2:38:49 time: 0.5948 data_time: 0.0075 memory: 15293 grad_norm: 4.5575 loss: 0.6542 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6542 2023/05/27 16:05:58 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:05:58 - mmengine - INFO - Epoch(train) [33][920/940] lr: 1.0000e-03 eta: 2:38:37 time: 0.5861 data_time: 0.0077 memory: 15293 grad_norm: 4.6406 loss: 0.5299 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5299 2023/05/27 16:06:09 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:06:09 - mmengine - INFO - Epoch(train) [33][940/940] lr: 1.0000e-03 eta: 2:38:24 time: 0.5685 data_time: 0.0072 memory: 15293 grad_norm: 4.6064 loss: 0.6541 top1_acc: 0.5000 top5_acc: 0.5000 loss_cls: 0.6541 2023/05/27 16:06:09 - mmengine - INFO - Saving checkpoint at 33 epochs 2023/05/27 16:06:17 - mmengine - INFO - Epoch(val) [33][20/78] eta: 0:00:17 time: 0.2964 data_time: 0.1119 memory: 2851 2023/05/27 16:06:22 - mmengine - INFO - Epoch(val) [33][40/78] eta: 0:00:09 
time: 0.2097 data_time: 0.0249 memory: 2851 2023/05/27 16:06:26 - mmengine - INFO - Epoch(val) [33][60/78] eta: 0:00:04 time: 0.2256 data_time: 0.0411 memory: 2851 2023/05/27 16:06:57 - mmengine - INFO - Epoch(val) [33][78/78] acc/top1: 0.7700 acc/top5: 0.9300 acc/mean1: 0.7698 data_time: 0.0463 time: 0.2279 2023/05/27 16:07:11 - mmengine - INFO - Epoch(train) [34][ 20/940] lr: 1.0000e-03 eta: 2:38:15 time: 0.6811 data_time: 0.0774 memory: 15293 grad_norm: 4.4450 loss: 0.5924 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5924 2023/05/27 16:07:22 - mmengine - INFO - Epoch(train) [34][ 40/940] lr: 1.0000e-03 eta: 2:38:03 time: 0.5842 data_time: 0.0075 memory: 15293 grad_norm: 4.6354 loss: 0.6740 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6740 2023/05/27 16:07:34 - mmengine - INFO - Epoch(train) [34][ 60/940] lr: 1.0000e-03 eta: 2:37:51 time: 0.5952 data_time: 0.0073 memory: 15293 grad_norm: 4.8537 loss: 0.7736 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7736 2023/05/27 16:07:46 - mmengine - INFO - Epoch(train) [34][ 80/940] lr: 1.0000e-03 eta: 2:37:39 time: 0.5876 data_time: 0.0077 memory: 15293 grad_norm: 4.2936 loss: 0.6463 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6463 2023/05/27 16:07:58 - mmengine - INFO - Epoch(train) [34][100/940] lr: 1.0000e-03 eta: 2:37:27 time: 0.5872 data_time: 0.0073 memory: 15293 grad_norm: 4.3115 loss: 0.6452 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6452 2023/05/27 16:08:10 - mmengine - INFO - Epoch(train) [34][120/940] lr: 1.0000e-03 eta: 2:37:15 time: 0.5859 data_time: 0.0074 memory: 15293 grad_norm: 4.4181 loss: 0.7488 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7488 2023/05/27 16:08:21 - mmengine - INFO - Epoch(train) [34][140/940] lr: 1.0000e-03 eta: 2:37:02 time: 0.5867 data_time: 0.0077 memory: 15293 grad_norm: 4.4859 loss: 0.7948 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7948 2023/05/27 16:08:33 - mmengine - INFO - Epoch(train) [34][160/940] lr: 1.0000e-03 eta: 2:36:50 time: 0.5920 data_time: 0.0076 memory: 
15293 grad_norm: 4.8313 loss: 0.6041 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6041 2023/05/27 16:08:45 - mmengine - INFO - Epoch(train) [34][180/940] lr: 1.0000e-03 eta: 2:36:38 time: 0.5907 data_time: 0.0076 memory: 15293 grad_norm: 4.3408 loss: 0.6155 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6155 2023/05/27 16:08:57 - mmengine - INFO - Epoch(train) [34][200/940] lr: 1.0000e-03 eta: 2:36:27 time: 0.5964 data_time: 0.0075 memory: 15293 grad_norm: 4.2386 loss: 0.7523 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7523 2023/05/27 16:09:09 - mmengine - INFO - Epoch(train) [34][220/940] lr: 1.0000e-03 eta: 2:36:15 time: 0.5946 data_time: 0.0076 memory: 15293 grad_norm: 4.6952 loss: 0.5353 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5353 2023/05/27 16:09:21 - mmengine - INFO - Epoch(train) [34][240/940] lr: 1.0000e-03 eta: 2:36:03 time: 0.5865 data_time: 0.0074 memory: 15293 grad_norm: 5.4110 loss: 0.4978 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4978 2023/05/27 16:09:32 - mmengine - INFO - Epoch(train) [34][260/940] lr: 1.0000e-03 eta: 2:35:50 time: 0.5890 data_time: 0.0075 memory: 15293 grad_norm: 4.5705 loss: 0.6201 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6201 2023/05/27 16:09:44 - mmengine - INFO - Epoch(train) [34][280/940] lr: 1.0000e-03 eta: 2:35:38 time: 0.5862 data_time: 0.0074 memory: 15293 grad_norm: 4.3338 loss: 0.7556 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7556 2023/05/27 16:09:56 - mmengine - INFO - Epoch(train) [34][300/940] lr: 1.0000e-03 eta: 2:35:26 time: 0.5939 data_time: 0.0079 memory: 15293 grad_norm: 4.5562 loss: 0.8885 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.8885 2023/05/27 16:10:08 - mmengine - INFO - Epoch(train) [34][320/940] lr: 1.0000e-03 eta: 2:35:14 time: 0.5907 data_time: 0.0076 memory: 15293 grad_norm: 4.6237 loss: 0.5304 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5304 2023/05/27 16:10:20 - mmengine - INFO - Epoch(train) [34][340/940] lr: 1.0000e-03 eta: 2:35:02 time: 0.5918 data_time: 0.0073 memory: 
15293 grad_norm: 4.3924 loss: 0.7233 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7233 2023/05/27 16:10:31 - mmengine - INFO - Epoch(train) [34][360/940] lr: 1.0000e-03 eta: 2:34:50 time: 0.5900 data_time: 0.0077 memory: 15293 grad_norm: 4.3823 loss: 0.5727 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5727 2023/05/27 16:10:43 - mmengine - INFO - Epoch(train) [34][380/940] lr: 1.0000e-03 eta: 2:34:38 time: 0.5869 data_time: 0.0077 memory: 15293 grad_norm: 4.3873 loss: 0.5253 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5253 2023/05/27 16:10:55 - mmengine - INFO - Epoch(train) [34][400/940] lr: 1.0000e-03 eta: 2:34:26 time: 0.5876 data_time: 0.0075 memory: 15293 grad_norm: 4.6323 loss: 0.5031 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5031 2023/05/27 16:11:07 - mmengine - INFO - Epoch(train) [34][420/940] lr: 1.0000e-03 eta: 2:34:14 time: 0.5941 data_time: 0.0078 memory: 15293 grad_norm: 4.4000 loss: 0.5504 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5504 2023/05/27 16:11:19 - mmengine - INFO - Epoch(train) [34][440/940] lr: 1.0000e-03 eta: 2:34:02 time: 0.5974 data_time: 0.0076 memory: 15293 grad_norm: 5.1162 loss: 0.6111 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6111 2023/05/27 16:11:31 - mmengine - INFO - Epoch(train) [34][460/940] lr: 1.0000e-03 eta: 2:33:50 time: 0.5909 data_time: 0.0076 memory: 15293 grad_norm: 6.4669 loss: 0.5293 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5293 2023/05/27 16:11:42 - mmengine - INFO - Epoch(train) [34][480/940] lr: 1.0000e-03 eta: 2:33:39 time: 0.5917 data_time: 0.0076 memory: 15293 grad_norm: 4.4165 loss: 0.5945 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5945 2023/05/27 16:11:54 - mmengine - INFO - Epoch(train) [34][500/940] lr: 1.0000e-03 eta: 2:33:26 time: 0.5864 data_time: 0.0076 memory: 15293 grad_norm: 4.6218 loss: 0.6195 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6195 2023/05/27 16:12:06 - mmengine - INFO - Epoch(train) [34][520/940] lr: 1.0000e-03 eta: 2:33:14 time: 0.5866 data_time: 0.0076 memory: 
15293 grad_norm: 4.4893 loss: 0.6597 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6597 2023/05/27 16:12:18 - mmengine - INFO - Epoch(train) [34][540/940] lr: 1.0000e-03 eta: 2:33:02 time: 0.5964 data_time: 0.0075 memory: 15293 grad_norm: 4.2348 loss: 0.6692 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6692 2023/05/27 16:12:30 - mmengine - INFO - Epoch(train) [34][560/940] lr: 1.0000e-03 eta: 2:32:50 time: 0.5927 data_time: 0.0075 memory: 15293 grad_norm: 6.7080 loss: 0.7226 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7226 2023/05/27 16:12:41 - mmengine - INFO - Epoch(train) [34][580/940] lr: 1.0000e-03 eta: 2:32:38 time: 0.5870 data_time: 0.0075 memory: 15293 grad_norm: 4.4403 loss: 0.7513 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7513 2023/05/27 16:12:53 - mmengine - INFO - Epoch(train) [34][600/940] lr: 1.0000e-03 eta: 2:32:26 time: 0.5932 data_time: 0.0077 memory: 15293 grad_norm: 4.7593 loss: 0.5820 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5820 2023/05/27 16:13:05 - mmengine - INFO - Epoch(train) [34][620/940] lr: 1.0000e-03 eta: 2:32:15 time: 0.6030 data_time: 0.0074 memory: 15293 grad_norm: 4.4255 loss: 0.5958 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5958 2023/05/27 16:13:17 - mmengine - INFO - Epoch(train) [34][640/940] lr: 1.0000e-03 eta: 2:32:03 time: 0.6025 data_time: 0.0074 memory: 15293 grad_norm: 4.5804 loss: 0.8437 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.8437 2023/05/27 16:13:29 - mmengine - INFO - Epoch(train) [34][660/940] lr: 1.0000e-03 eta: 2:31:51 time: 0.5870 data_time: 0.0076 memory: 15293 grad_norm: 4.2859 loss: 0.6661 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6661 2023/05/27 16:13:41 - mmengine - INFO - Epoch(train) [34][680/940] lr: 1.0000e-03 eta: 2:31:39 time: 0.5906 data_time: 0.0076 memory: 15293 grad_norm: 4.7399 loss: 0.8758 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8758 2023/05/27 16:13:53 - mmengine - INFO - Epoch(train) [34][700/940] lr: 1.0000e-03 eta: 2:31:27 time: 0.5921 data_time: 0.0076 memory: 
15293 grad_norm: 4.6559 loss: 0.6502 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6502 2023/05/27 16:14:05 - mmengine - INFO - Epoch(train) [34][720/940] lr: 1.0000e-03 eta: 2:31:15 time: 0.5883 data_time: 0.0077 memory: 15293 grad_norm: 4.3407 loss: 0.6008 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6008 2023/05/27 16:14:16 - mmengine - INFO - Epoch(train) [34][740/940] lr: 1.0000e-03 eta: 2:31:03 time: 0.5913 data_time: 0.0074 memory: 15293 grad_norm: 4.8546 loss: 0.5920 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5920 2023/05/27 16:14:28 - mmengine - INFO - Epoch(train) [34][760/940] lr: 1.0000e-03 eta: 2:30:51 time: 0.5955 data_time: 0.0075 memory: 15293 grad_norm: 4.3649 loss: 0.6011 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6011 2023/05/27 16:14:40 - mmengine - INFO - Epoch(train) [34][780/940] lr: 1.0000e-03 eta: 2:30:39 time: 0.5978 data_time: 0.0076 memory: 15293 grad_norm: 4.8191 loss: 0.6930 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6930 2023/05/27 16:14:52 - mmengine - INFO - Epoch(train) [34][800/940] lr: 1.0000e-03 eta: 2:30:27 time: 0.5875 data_time: 0.0074 memory: 15293 grad_norm: 4.2345 loss: 0.4177 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4177 2023/05/27 16:15:04 - mmengine - INFO - Epoch(train) [34][820/940] lr: 1.0000e-03 eta: 2:30:15 time: 0.5956 data_time: 0.0074 memory: 15293 grad_norm: 4.5280 loss: 0.7663 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.7663 2023/05/27 16:15:16 - mmengine - INFO - Epoch(train) [34][840/940] lr: 1.0000e-03 eta: 2:30:03 time: 0.5924 data_time: 0.0077 memory: 15293 grad_norm: 4.2951 loss: 0.4984 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4984 2023/05/27 16:15:28 - mmengine - INFO - Epoch(train) [34][860/940] lr: 1.0000e-03 eta: 2:29:51 time: 0.5916 data_time: 0.0076 memory: 15293 grad_norm: 4.8352 loss: 0.7547 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7547 2023/05/27 16:15:39 - mmengine - INFO - Epoch(train) [34][880/940] lr: 1.0000e-03 eta: 2:29:39 time: 0.5867 data_time: 0.0076 memory: 
15293 grad_norm: 4.4878 loss: 0.7044 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7044 2023/05/27 16:15:51 - mmengine - INFO - Epoch(train) [34][900/940] lr: 1.0000e-03 eta: 2:29:27 time: 0.5951 data_time: 0.0076 memory: 15293 grad_norm: 4.7329 loss: 0.7619 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7619 2023/05/27 16:16:03 - mmengine - INFO - Epoch(train) [34][920/940] lr: 1.0000e-03 eta: 2:29:16 time: 0.5968 data_time: 0.0077 memory: 15293 grad_norm: 6.8583 loss: 0.6856 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6856 2023/05/27 16:16:14 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:16:14 - mmengine - INFO - Epoch(train) [34][940/940] lr: 1.0000e-03 eta: 2:29:03 time: 0.5667 data_time: 0.0074 memory: 15293 grad_norm: 4.7981 loss: 0.5069 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5069 2023/05/27 16:16:20 - mmengine - INFO - Epoch(val) [34][20/78] eta: 0:00:16 time: 0.2906 data_time: 0.1054 memory: 2851 2023/05/27 16:16:25 - mmengine - INFO - Epoch(val) [34][40/78] eta: 0:00:09 time: 0.2148 data_time: 0.0294 memory: 2851 2023/05/27 16:16:29 - mmengine - INFO - Epoch(val) [34][60/78] eta: 0:00:04 time: 0.2224 data_time: 0.0374 memory: 2851 2023/05/27 16:16:45 - mmengine - INFO - Epoch(val) [34][78/78] acc/top1: 0.7706 acc/top5: 0.9292 acc/mean1: 0.7704 data_time: 0.0448 time: 0.2269 2023/05/27 16:16:58 - mmengine - INFO - Epoch(train) [35][ 20/940] lr: 1.0000e-03 eta: 2:28:54 time: 0.6924 data_time: 0.0777 memory: 15293 grad_norm: 4.6934 loss: 0.7374 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7374 2023/05/27 16:17:10 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:17:10 - mmengine - INFO - Epoch(train) [35][ 40/940] lr: 1.0000e-03 eta: 2:28:41 time: 0.5857 data_time: 0.0076 memory: 15293 grad_norm: 5.0876 loss: 0.6980 top1_acc: 0.5000 top5_acc: 0.6250 loss_cls: 0.6980 2023/05/27 
16:17:22 - mmengine - INFO - Epoch(train) [35][ 60/940] lr: 1.0000e-03 eta: 2:28:30 time: 0.5959 data_time: 0.0074 memory: 15293 grad_norm: 5.2946 loss: 0.5282 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5282 2023/05/27 16:17:34 - mmengine - INFO - Epoch(train) [35][ 80/940] lr: 1.0000e-03 eta: 2:28:17 time: 0.5868 data_time: 0.0076 memory: 15293 grad_norm: 4.7126 loss: 0.4578 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4578 2023/05/27 16:17:46 - mmengine - INFO - Epoch(train) [35][100/940] lr: 1.0000e-03 eta: 2:28:05 time: 0.5871 data_time: 0.0077 memory: 15293 grad_norm: 4.2586 loss: 0.6731 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6731 2023/05/27 16:17:57 - mmengine - INFO - Epoch(train) [35][120/940] lr: 1.0000e-03 eta: 2:27:53 time: 0.5898 data_time: 0.0075 memory: 15293 grad_norm: 4.4274 loss: 0.6405 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6405 2023/05/27 16:18:09 - mmengine - INFO - Epoch(train) [35][140/940] lr: 1.0000e-03 eta: 2:27:41 time: 0.5955 data_time: 0.0075 memory: 15293 grad_norm: 4.5814 loss: 0.5764 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5764 2023/05/27 16:18:21 - mmengine - INFO - Epoch(train) [35][160/940] lr: 1.0000e-03 eta: 2:27:30 time: 0.6073 data_time: 0.0073 memory: 15293 grad_norm: 4.7563 loss: 0.5535 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5535 2023/05/27 16:18:33 - mmengine - INFO - Epoch(train) [35][180/940] lr: 1.0000e-03 eta: 2:27:18 time: 0.5932 data_time: 0.0074 memory: 15293 grad_norm: 4.0744 loss: 0.7355 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7355 2023/05/27 16:18:45 - mmengine - INFO - Epoch(train) [35][200/940] lr: 1.0000e-03 eta: 2:27:06 time: 0.5913 data_time: 0.0073 memory: 15293 grad_norm: 4.5011 loss: 0.4486 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4486 2023/05/27 16:18:57 - mmengine - INFO - Epoch(train) [35][220/940] lr: 1.0000e-03 eta: 2:26:54 time: 0.5966 data_time: 0.0074 memory: 15293 grad_norm: 4.3830 loss: 0.6155 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6155 2023/05/27 
16:19:09 - mmengine - INFO - Epoch(train) [35][240/940] lr: 1.0000e-03 eta: 2:26:42 time: 0.5947 data_time: 0.0073 memory: 15293 grad_norm: 4.4108 loss: 0.6155 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6155 2023/05/27 16:19:21 - mmengine - INFO - Epoch(train) [35][260/940] lr: 1.0000e-03 eta: 2:26:30 time: 0.5929 data_time: 0.0075 memory: 15293 grad_norm: 4.6805 loss: 0.6927 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6927 2023/05/27 16:19:33 - mmengine - INFO - Epoch(train) [35][280/940] lr: 1.0000e-03 eta: 2:26:18 time: 0.5920 data_time: 0.0074 memory: 15293 grad_norm: 4.7142 loss: 0.5588 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5588 2023/05/27 16:19:45 - mmengine - INFO - Epoch(train) [35][300/940] lr: 1.0000e-03 eta: 2:26:06 time: 0.5933 data_time: 0.0074 memory: 15293 grad_norm: 4.3507 loss: 0.4959 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4959 2023/05/27 16:19:57 - mmengine - INFO - Epoch(train) [35][320/940] lr: 1.0000e-03 eta: 2:25:55 time: 0.5984 data_time: 0.0077 memory: 15293 grad_norm: 4.4760 loss: 0.5152 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5152 2023/05/27 16:20:08 - mmengine - INFO - Epoch(train) [35][340/940] lr: 1.0000e-03 eta: 2:25:42 time: 0.5873 data_time: 0.0075 memory: 15293 grad_norm: 4.6309 loss: 0.5076 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5076 2023/05/27 16:20:20 - mmengine - INFO - Epoch(train) [35][360/940] lr: 1.0000e-03 eta: 2:25:31 time: 0.5944 data_time: 0.0076 memory: 15293 grad_norm: 4.3809 loss: 0.6818 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6818 2023/05/27 16:20:32 - mmengine - INFO - Epoch(train) [35][380/940] lr: 1.0000e-03 eta: 2:25:19 time: 0.5943 data_time: 0.0075 memory: 15293 grad_norm: 4.4783 loss: 0.7255 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7255 2023/05/27 16:20:44 - mmengine - INFO - Epoch(train) [35][400/940] lr: 1.0000e-03 eta: 2:25:07 time: 0.5895 data_time: 0.0075 memory: 15293 grad_norm: 4.2326 loss: 0.6116 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6116 2023/05/27 
16:20:56 - mmengine - INFO - Epoch(train) [35][420/940] lr: 1.0000e-03 eta: 2:24:55 time: 0.5936 data_time: 0.0075 memory: 15293 grad_norm: 4.4150 loss: 0.6510 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6510 2023/05/27 16:21:08 - mmengine - INFO - Epoch(train) [35][440/940] lr: 1.0000e-03 eta: 2:24:43 time: 0.5940 data_time: 0.0076 memory: 15293 grad_norm: 4.6127 loss: 0.8348 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.8348 2023/05/27 16:21:19 - mmengine - INFO - Epoch(train) [35][460/940] lr: 1.0000e-03 eta: 2:24:31 time: 0.5943 data_time: 0.0076 memory: 15293 grad_norm: 4.8396 loss: 0.7105 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7105 2023/05/27 16:21:31 - mmengine - INFO - Epoch(train) [35][480/940] lr: 1.0000e-03 eta: 2:24:19 time: 0.5947 data_time: 0.0075 memory: 15293 grad_norm: 4.4914 loss: 0.6709 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6709 2023/05/27 16:21:43 - mmengine - INFO - Epoch(train) [35][500/940] lr: 1.0000e-03 eta: 2:24:07 time: 0.5902 data_time: 0.0075 memory: 15293 grad_norm: 4.5838 loss: 0.7309 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7309 2023/05/27 16:21:55 - mmengine - INFO - Epoch(train) [35][520/940] lr: 1.0000e-03 eta: 2:23:55 time: 0.5891 data_time: 0.0075 memory: 15293 grad_norm: 4.2677 loss: 0.5310 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5310 2023/05/27 16:22:07 - mmengine - INFO - Epoch(train) [35][540/940] lr: 1.0000e-03 eta: 2:23:43 time: 0.5883 data_time: 0.0076 memory: 15293 grad_norm: 4.4417 loss: 0.5079 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.5079 2023/05/27 16:22:19 - mmengine - INFO - Epoch(train) [35][560/940] lr: 1.0000e-03 eta: 2:23:31 time: 0.5948 data_time: 0.0073 memory: 15293 grad_norm: 4.3780 loss: 0.6549 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6549 2023/05/27 16:22:31 - mmengine - INFO - Epoch(train) [35][580/940] lr: 1.0000e-03 eta: 2:23:19 time: 0.5961 data_time: 0.0074 memory: 15293 grad_norm: 4.6568 loss: 0.6162 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6162 2023/05/27 
16:22:42 - mmengine - INFO - Epoch(train) [35][600/940] lr: 1.0000e-03 eta: 2:23:07 time: 0.5951 data_time: 0.0074 memory: 15293 grad_norm: 4.2563 loss: 0.7249 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7249 2023/05/27 16:22:54 - mmengine - INFO - Epoch(train) [35][620/940] lr: 1.0000e-03 eta: 2:22:55 time: 0.5943 data_time: 0.0078 memory: 15293 grad_norm: 4.3419 loss: 0.7042 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7042 2023/05/27 16:23:06 - mmengine - INFO - Epoch(train) [35][640/940] lr: 1.0000e-03 eta: 2:22:43 time: 0.5863 data_time: 0.0075 memory: 15293 grad_norm: 4.8126 loss: 0.6099 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6099 2023/05/27 16:23:18 - mmengine - INFO - Epoch(train) [35][660/940] lr: 1.0000e-03 eta: 2:22:31 time: 0.5898 data_time: 0.0079 memory: 15293 grad_norm: 4.4168 loss: 0.6498 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6498 2023/05/27 16:23:30 - mmengine - INFO - Epoch(train) [35][680/940] lr: 1.0000e-03 eta: 2:22:19 time: 0.5856 data_time: 0.0076 memory: 15293 grad_norm: 4.4118 loss: 0.5108 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.5108 2023/05/27 16:23:41 - mmengine - INFO - Epoch(train) [35][700/940] lr: 1.0000e-03 eta: 2:22:07 time: 0.5921 data_time: 0.0077 memory: 15293 grad_norm: 4.5434 loss: 0.6277 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6277 2023/05/27 16:23:53 - mmengine - INFO - Epoch(train) [35][720/940] lr: 1.0000e-03 eta: 2:21:55 time: 0.5882 data_time: 0.0074 memory: 15293 grad_norm: 4.9344 loss: 0.5665 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5665 2023/05/27 16:24:05 - mmengine - INFO - Epoch(train) [35][740/940] lr: 1.0000e-03 eta: 2:21:43 time: 0.5974 data_time: 0.0077 memory: 15293 grad_norm: 5.7281 loss: 0.6928 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.6928 2023/05/27 16:24:17 - mmengine - INFO - Epoch(train) [35][760/940] lr: 1.0000e-03 eta: 2:21:32 time: 0.5941 data_time: 0.0076 memory: 15293 grad_norm: 4.3709 loss: 0.4492 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4492 2023/05/27 
16:24:29 - mmengine - INFO - Epoch(train) [35][780/940] lr: 1.0000e-03 eta: 2:21:19 time: 0.5883 data_time: 0.0077 memory: 15293 grad_norm: 4.2920 loss: 0.7717 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.7717 2023/05/27 16:24:41 - mmengine - INFO - Epoch(train) [35][800/940] lr: 1.0000e-03 eta: 2:21:07 time: 0.5884 data_time: 0.0075 memory: 15293 grad_norm: 4.6497 loss: 0.5614 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5614 2023/05/27 16:24:52 - mmengine - INFO - Epoch(train) [35][820/940] lr: 1.0000e-03 eta: 2:20:55 time: 0.5883 data_time: 0.0074 memory: 15293 grad_norm: 4.3628 loss: 0.5916 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5916 2023/05/27 16:25:04 - mmengine - INFO - Epoch(train) [35][840/940] lr: 1.0000e-03 eta: 2:20:44 time: 0.5975 data_time: 0.0080 memory: 15293 grad_norm: 4.4813 loss: 0.5746 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5746 2023/05/27 16:25:16 - mmengine - INFO - Epoch(train) [35][860/940] lr: 1.0000e-03 eta: 2:20:32 time: 0.5861 data_time: 0.0079 memory: 15293 grad_norm: 4.3212 loss: 0.5500 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5500 2023/05/27 16:25:28 - mmengine - INFO - Epoch(train) [35][880/940] lr: 1.0000e-03 eta: 2:20:19 time: 0.5867 data_time: 0.0076 memory: 15293 grad_norm: 4.3767 loss: 0.6569 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6569 2023/05/27 16:25:40 - mmengine - INFO - Epoch(train) [35][900/940] lr: 1.0000e-03 eta: 2:20:08 time: 0.5933 data_time: 0.0075 memory: 15293 grad_norm: 4.6945 loss: 0.5504 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5504 2023/05/27 16:25:51 - mmengine - INFO - Epoch(train) [35][920/940] lr: 1.0000e-03 eta: 2:19:56 time: 0.5888 data_time: 0.0076 memory: 15293 grad_norm: 5.0295 loss: 0.5399 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5399 2023/05/27 16:26:03 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:26:03 - mmengine - INFO - Epoch(train) [35][940/940] lr: 1.0000e-03 eta: 2:19:43 
time: 0.5671 data_time: 0.0075 memory: 15293 grad_norm: 4.4569 loss: 0.6359 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.6359 2023/05/27 16:26:09 - mmengine - INFO - Epoch(val) [35][20/78] eta: 0:00:17 time: 0.2941 data_time: 0.1087 memory: 2851 2023/05/27 16:26:13 - mmengine - INFO - Epoch(val) [35][40/78] eta: 0:00:09 time: 0.2120 data_time: 0.0264 memory: 2851 2023/05/27 16:26:17 - mmengine - INFO - Epoch(val) [35][60/78] eta: 0:00:04 time: 0.2316 data_time: 0.0469 memory: 2851 2023/05/27 16:29:39 - mmengine - INFO - Epoch(val) [35][78/78] acc/top1: 0.7705 acc/top5: 0.9287 acc/mean1: 0.7704 data_time: 0.0471 time: 0.2293 2023/05/27 16:29:53 - mmengine - INFO - Epoch(train) [36][ 20/940] lr: 1.0000e-03 eta: 2:19:33 time: 0.6881 data_time: 0.0834 memory: 15293 grad_norm: 4.1509 loss: 0.5079 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5079 2023/05/27 16:30:04 - mmengine - INFO - Epoch(train) [36][ 40/940] lr: 1.0000e-03 eta: 2:19:21 time: 0.5864 data_time: 0.0077 memory: 15293 grad_norm: 4.4571 loss: 0.6989 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6989 2023/05/27 16:30:16 - mmengine - INFO - Epoch(train) [36][ 60/940] lr: 1.0000e-03 eta: 2:19:09 time: 0.5888 data_time: 0.0078 memory: 15293 grad_norm: 4.4679 loss: 0.5934 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5934 2023/05/27 16:30:28 - mmengine - INFO - Epoch(train) [36][ 80/940] lr: 1.0000e-03 eta: 2:18:57 time: 0.5964 data_time: 0.0078 memory: 15293 grad_norm: 4.4507 loss: 0.6426 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6426 2023/05/27 16:30:40 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:30:40 - mmengine - INFO - Epoch(train) [36][100/940] lr: 1.0000e-03 eta: 2:18:45 time: 0.5866 data_time: 0.0079 memory: 15293 grad_norm: 4.9038 loss: 0.7714 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7714 2023/05/27 16:30:52 - mmengine - INFO - Epoch(train) [36][120/940] lr: 1.0000e-03 eta: 2:18:33 time: 0.5891 
data_time: 0.0078 memory: 15293 grad_norm: 4.7981 loss: 0.5392 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5392 2023/05/27 16:31:03 - mmengine - INFO - Epoch(train) [36][140/940] lr: 1.0000e-03 eta: 2:18:21 time: 0.5904 data_time: 0.0078 memory: 15293 grad_norm: 4.8259 loss: 0.7327 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7327 2023/05/27 16:31:15 - mmengine - INFO - Epoch(train) [36][160/940] lr: 1.0000e-03 eta: 2:18:09 time: 0.5861 data_time: 0.0076 memory: 15293 grad_norm: 4.3432 loss: 0.4670 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4670 2023/05/27 16:31:27 - mmengine - INFO - Epoch(train) [36][180/940] lr: 1.0000e-03 eta: 2:17:57 time: 0.6024 data_time: 0.0079 memory: 15293 grad_norm: 5.0271 loss: 0.6067 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6067 2023/05/27 16:31:39 - mmengine - INFO - Epoch(train) [36][200/940] lr: 1.0000e-03 eta: 2:17:45 time: 0.5977 data_time: 0.0075 memory: 15293 grad_norm: 4.5231 loss: 0.5725 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5725 2023/05/27 16:31:51 - mmengine - INFO - Epoch(train) [36][220/940] lr: 1.0000e-03 eta: 2:17:33 time: 0.5861 data_time: 0.0077 memory: 15293 grad_norm: 4.8745 loss: 0.7343 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7343 2023/05/27 16:32:03 - mmengine - INFO - Epoch(train) [36][240/940] lr: 1.0000e-03 eta: 2:17:22 time: 0.5971 data_time: 0.0078 memory: 15293 grad_norm: 4.2729 loss: 0.8516 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.8516 2023/05/27 16:32:15 - mmengine - INFO - Epoch(train) [36][260/940] lr: 1.0000e-03 eta: 2:17:10 time: 0.5925 data_time: 0.0078 memory: 15293 grad_norm: 4.2686 loss: 0.4929 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4929 2023/05/27 16:32:26 - mmengine - INFO - Epoch(train) [36][280/940] lr: 1.0000e-03 eta: 2:16:58 time: 0.5895 data_time: 0.0079 memory: 15293 grad_norm: 4.3295 loss: 0.8029 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8029 2023/05/27 16:32:38 - mmengine - INFO - Epoch(train) [36][300/940] lr: 1.0000e-03 eta: 2:16:46 time: 0.5886 
data_time: 0.0081 memory: 15293 grad_norm: 4.2442 loss: 0.5076 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5076 2023/05/27 16:32:50 - mmengine - INFO - Epoch(train) [36][320/940] lr: 1.0000e-03 eta: 2:16:34 time: 0.5867 data_time: 0.0076 memory: 15293 grad_norm: 4.8167 loss: 0.7496 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.7496 2023/05/27 16:33:02 - mmengine - INFO - Epoch(train) [36][340/940] lr: 1.0000e-03 eta: 2:16:22 time: 0.5919 data_time: 0.0078 memory: 15293 grad_norm: 5.4033 loss: 0.4658 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.4658 2023/05/27 16:33:14 - mmengine - INFO - Epoch(train) [36][360/940] lr: 1.0000e-03 eta: 2:16:10 time: 0.5888 data_time: 0.0082 memory: 15293 grad_norm: 4.5907 loss: 0.7228 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7228 2023/05/27 16:33:25 - mmengine - INFO - Epoch(train) [36][380/940] lr: 1.0000e-03 eta: 2:15:58 time: 0.5947 data_time: 0.0079 memory: 15293 grad_norm: 4.8040 loss: 0.5053 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5053 2023/05/27 16:33:37 - mmengine - INFO - Epoch(train) [36][400/940] lr: 1.0000e-03 eta: 2:15:46 time: 0.5862 data_time: 0.0075 memory: 15293 grad_norm: 4.5462 loss: 0.6498 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6498 2023/05/27 16:33:49 - mmengine - INFO - Epoch(train) [36][420/940] lr: 1.0000e-03 eta: 2:15:34 time: 0.5910 data_time: 0.0077 memory: 15293 grad_norm: 4.5503 loss: 0.6804 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6804 2023/05/27 16:34:01 - mmengine - INFO - Epoch(train) [36][440/940] lr: 1.0000e-03 eta: 2:15:22 time: 0.5920 data_time: 0.0080 memory: 15293 grad_norm: 4.5562 loss: 0.6875 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6875 2023/05/27 16:34:13 - mmengine - INFO - Epoch(train) [36][460/940] lr: 1.0000e-03 eta: 2:15:10 time: 0.5945 data_time: 0.0090 memory: 15293 grad_norm: 4.2982 loss: 0.6206 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6206 2023/05/27 16:34:25 - mmengine - INFO - Epoch(train) [36][480/940] lr: 1.0000e-03 eta: 2:14:58 time: 0.5926 
data_time: 0.0079 memory: 15293 grad_norm: 4.7868 loss: 0.6154 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6154 2023/05/27 16:34:36 - mmengine - INFO - Epoch(train) [36][500/940] lr: 1.0000e-03 eta: 2:14:46 time: 0.5882 data_time: 0.0075 memory: 15293 grad_norm: 5.2083 loss: 0.8415 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.8415 2023/05/27 16:34:48 - mmengine - INFO - Epoch(train) [36][520/940] lr: 1.0000e-03 eta: 2:14:34 time: 0.5889 data_time: 0.0074 memory: 15293 grad_norm: 4.5454 loss: 0.6545 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6545 2023/05/27 16:35:00 - mmengine - INFO - Epoch(train) [36][540/940] lr: 1.0000e-03 eta: 2:14:22 time: 0.5864 data_time: 0.0077 memory: 15293 grad_norm: 4.7443 loss: 0.7823 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7823 2023/05/27 16:35:12 - mmengine - INFO - Epoch(train) [36][560/940] lr: 1.0000e-03 eta: 2:14:10 time: 0.5978 data_time: 0.0078 memory: 15293 grad_norm: 4.7904 loss: 0.5443 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5443 2023/05/27 16:35:24 - mmengine - INFO - Epoch(train) [36][580/940] lr: 1.0000e-03 eta: 2:13:58 time: 0.5916 data_time: 0.0076 memory: 15293 grad_norm: 4.7884 loss: 0.5889 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5889 2023/05/27 16:35:36 - mmengine - INFO - Epoch(train) [36][600/940] lr: 1.0000e-03 eta: 2:13:46 time: 0.5962 data_time: 0.0077 memory: 15293 grad_norm: 4.4896 loss: 0.4116 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4116 2023/05/27 16:35:47 - mmengine - INFO - Epoch(train) [36][620/940] lr: 1.0000e-03 eta: 2:13:34 time: 0.5904 data_time: 0.0076 memory: 15293 grad_norm: 4.7183 loss: 0.5999 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5999 2023/05/27 16:35:59 - mmengine - INFO - Epoch(train) [36][640/940] lr: 1.0000e-03 eta: 2:13:22 time: 0.5931 data_time: 0.0078 memory: 15293 grad_norm: 4.7114 loss: 0.6311 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6311 2023/05/27 16:36:11 - mmengine - INFO - Epoch(train) [36][660/940] lr: 1.0000e-03 eta: 2:13:10 time: 0.5872 
data_time: 0.0080 memory: 15293 grad_norm: 4.6522 loss: 0.8229 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.8229 2023/05/27 16:36:23 - mmengine - INFO - Epoch(train) [36][680/940] lr: 1.0000e-03 eta: 2:12:58 time: 0.5912 data_time: 0.0079 memory: 15293 grad_norm: 4.6269 loss: 0.7320 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7320 2023/05/27 16:36:35 - mmengine - INFO - Epoch(train) [36][700/940] lr: 1.0000e-03 eta: 2:12:46 time: 0.5929 data_time: 0.0088 memory: 15293 grad_norm: 4.5072 loss: 0.6731 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6731 2023/05/27 16:36:47 - mmengine - INFO - Epoch(train) [36][720/940] lr: 1.0000e-03 eta: 2:12:34 time: 0.5920 data_time: 0.0084 memory: 15293 grad_norm: 4.7458 loss: 0.5322 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5322 2023/05/27 16:36:58 - mmengine - INFO - Epoch(train) [36][740/940] lr: 1.0000e-03 eta: 2:12:23 time: 0.5923 data_time: 0.0082 memory: 15293 grad_norm: 4.5189 loss: 0.7417 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7417 2023/05/27 16:37:10 - mmengine - INFO - Epoch(train) [36][760/940] lr: 1.0000e-03 eta: 2:12:11 time: 0.5903 data_time: 0.0077 memory: 15293 grad_norm: 4.4558 loss: 0.5496 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5496 2023/05/27 16:37:22 - mmengine - INFO - Epoch(train) [36][780/940] lr: 1.0000e-03 eta: 2:11:59 time: 0.5907 data_time: 0.0077 memory: 15293 grad_norm: 4.4138 loss: 0.5324 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.5324 2023/05/27 16:37:34 - mmengine - INFO - Epoch(train) [36][800/940] lr: 1.0000e-03 eta: 2:11:47 time: 0.5877 data_time: 0.0079 memory: 15293 grad_norm: 4.4115 loss: 0.5818 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5818 2023/05/27 16:37:46 - mmengine - INFO - Epoch(train) [36][820/940] lr: 1.0000e-03 eta: 2:11:35 time: 0.5964 data_time: 0.0077 memory: 15293 grad_norm: 4.8924 loss: 0.5793 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5793 2023/05/27 16:37:58 - mmengine - INFO - Epoch(train) [36][840/940] lr: 1.0000e-03 eta: 2:11:23 time: 0.5920 
data_time: 0.0074 memory: 15293 grad_norm: 5.5842 loss: 0.6455 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6455 2023/05/27 16:38:09 - mmengine - INFO - Epoch(train) [36][860/940] lr: 1.0000e-03 eta: 2:11:11 time: 0.5910 data_time: 0.0077 memory: 15293 grad_norm: 4.6043 loss: 0.6238 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6238 2023/05/27 16:38:21 - mmengine - INFO - Epoch(train) [36][880/940] lr: 1.0000e-03 eta: 2:10:59 time: 0.5866 data_time: 0.0078 memory: 15293 grad_norm: 4.8223 loss: 0.4335 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4335 2023/05/27 16:38:33 - mmengine - INFO - Epoch(train) [36][900/940] lr: 1.0000e-03 eta: 2:10:47 time: 0.5938 data_time: 0.0077 memory: 15293 grad_norm: 4.8070 loss: 0.6124 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6124 2023/05/27 16:38:45 - mmengine - INFO - Epoch(train) [36][920/940] lr: 1.0000e-03 eta: 2:10:35 time: 0.5864 data_time: 0.0077 memory: 15293 grad_norm: 5.3300 loss: 0.4965 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4965 2023/05/27 16:38:56 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:38:56 - mmengine - INFO - Epoch(train) [36][940/940] lr: 1.0000e-03 eta: 2:10:22 time: 0.5690 data_time: 0.0075 memory: 15293 grad_norm: 5.3749 loss: 0.8074 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.8074 2023/05/27 16:38:56 - mmengine - INFO - Saving checkpoint at 36 epochs 2023/05/27 16:39:04 - mmengine - INFO - Epoch(val) [36][20/78] eta: 0:00:17 time: 0.3019 data_time: 0.1174 memory: 2851 2023/05/27 16:39:09 - mmengine - INFO - Epoch(val) [36][40/78] eta: 0:00:09 time: 0.2196 data_time: 0.0350 memory: 2851 2023/05/27 16:39:13 - mmengine - INFO - Epoch(val) [36][60/78] eta: 0:00:04 time: 0.2238 data_time: 0.0395 memory: 2851 2023/05/27 16:39:18 - mmengine - INFO - Epoch(val) [36][78/78] acc/top1: 0.7717 acc/top5: 0.9297 acc/mean1: 0.7716 data_time: 0.0495 time: 0.2310 2023/05/27 16:39:18 - mmengine - INFO - The previous best 
checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_28.pth is removed 2023/05/27 16:39:19 - mmengine - INFO - The best checkpoint with 0.7717 acc/top1 at 36 epoch is saved to best_acc_top1_epoch_36.pth. 2023/05/27 16:39:32 - mmengine - INFO - Epoch(train) [37][ 20/940] lr: 1.0000e-03 eta: 2:10:12 time: 0.6536 data_time: 0.0700 memory: 15293 grad_norm: 5.5520 loss: 0.5810 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5810 2023/05/27 16:39:44 - mmengine - INFO - Epoch(train) [37][ 40/940] lr: 1.0000e-03 eta: 2:10:00 time: 0.5929 data_time: 0.0073 memory: 15293 grad_norm: 4.6361 loss: 0.4957 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4957 2023/05/27 16:39:56 - mmengine - INFO - Epoch(train) [37][ 60/940] lr: 1.0000e-03 eta: 2:09:48 time: 0.5943 data_time: 0.0078 memory: 15293 grad_norm: 5.0796 loss: 0.7030 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7030 2023/05/27 16:40:08 - mmengine - INFO - Epoch(train) [37][ 80/940] lr: 1.0000e-03 eta: 2:09:36 time: 0.5939 data_time: 0.0075 memory: 15293 grad_norm: 4.4836 loss: 0.5368 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5368 2023/05/27 16:40:20 - mmengine - INFO - Epoch(train) [37][100/940] lr: 1.0000e-03 eta: 2:09:24 time: 0.5970 data_time: 0.0074 memory: 15293 grad_norm: 4.9695 loss: 0.5745 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5745 2023/05/27 16:40:32 - mmengine - INFO - Epoch(train) [37][120/940] lr: 1.0000e-03 eta: 2:09:12 time: 0.5968 data_time: 0.0076 memory: 15293 grad_norm: 5.6976 loss: 0.6265 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6265 2023/05/27 16:40:44 - mmengine - INFO - Epoch(train) [37][140/940] lr: 1.0000e-03 eta: 2:09:00 time: 0.5954 data_time: 0.0074 memory: 15293 grad_norm: 4.7952 loss: 0.4981 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4981 2023/05/27 16:40:56 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 
2023/05/27 16:40:56 - mmengine - INFO - Epoch(train) [37][160/940] lr: 1.0000e-03 eta: 2:08:48 time: 0.5877 data_time: 0.0081 memory: 15293 grad_norm: 4.5109 loss: 0.5934 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5934 2023/05/27 16:41:07 - mmengine - INFO - Epoch(train) [37][180/940] lr: 1.0000e-03 eta: 2:08:36 time: 0.5918 data_time: 0.0079 memory: 15293 grad_norm: 4.9350 loss: 0.5864 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5864 2023/05/27 16:41:19 - mmengine - INFO - Epoch(train) [37][200/940] lr: 1.0000e-03 eta: 2:08:25 time: 0.5961 data_time: 0.0077 memory: 15293 grad_norm: 4.9913 loss: 0.7005 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7005 2023/05/27 16:41:31 - mmengine - INFO - Epoch(train) [37][220/940] lr: 1.0000e-03 eta: 2:08:13 time: 0.5903 data_time: 0.0077 memory: 15293 grad_norm: 4.5949 loss: 0.6296 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6296 2023/05/27 16:41:43 - mmengine - INFO - Epoch(train) [37][240/940] lr: 1.0000e-03 eta: 2:08:01 time: 0.5864 data_time: 0.0074 memory: 15293 grad_norm: 4.3628 loss: 0.7402 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7402 2023/05/27 16:41:55 - mmengine - INFO - Epoch(train) [37][260/940] lr: 1.0000e-03 eta: 2:07:49 time: 0.5865 data_time: 0.0077 memory: 15293 grad_norm: 5.0886 loss: 0.4061 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4061 2023/05/27 16:42:06 - mmengine - INFO - Epoch(train) [37][280/940] lr: 1.0000e-03 eta: 2:07:37 time: 0.5919 data_time: 0.0076 memory: 15293 grad_norm: 5.0823 loss: 0.6236 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6236 2023/05/27 16:42:18 - mmengine - INFO - Epoch(train) [37][300/940] lr: 1.0000e-03 eta: 2:07:25 time: 0.5874 data_time: 0.0076 memory: 15293 grad_norm: 5.1194 loss: 0.4965 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4965 2023/05/27 16:42:30 - mmengine - INFO - Epoch(train) [37][320/940] lr: 1.0000e-03 eta: 2:07:13 time: 0.5876 data_time: 0.0077 memory: 15293 grad_norm: 4.5770 loss: 0.5397 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5397 
2023/05/27 16:42:42 - mmengine - INFO - Epoch(train) [37][340/940] lr: 1.0000e-03 eta: 2:07:01 time: 0.5953 data_time: 0.0075 memory: 15293 grad_norm: 7.0438 loss: 0.5521 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5521 2023/05/27 16:42:54 - mmengine - INFO - Epoch(train) [37][360/940] lr: 1.0000e-03 eta: 2:06:49 time: 0.5946 data_time: 0.0076 memory: 15293 grad_norm: 5.0225 loss: 0.7632 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7632 2023/05/27 16:43:05 - mmengine - INFO - Epoch(train) [37][380/940] lr: 1.0000e-03 eta: 2:06:37 time: 0.5862 data_time: 0.0077 memory: 15293 grad_norm: 5.1224 loss: 0.5083 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5083 2023/05/27 16:43:17 - mmengine - INFO - Epoch(train) [37][400/940] lr: 1.0000e-03 eta: 2:06:25 time: 0.5910 data_time: 0.0075 memory: 15293 grad_norm: 4.6161 loss: 0.6790 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6790 2023/05/27 16:43:29 - mmengine - INFO - Epoch(train) [37][420/940] lr: 1.0000e-03 eta: 2:06:13 time: 0.5921 data_time: 0.0074 memory: 15293 grad_norm: 4.5410 loss: 0.5193 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5193 2023/05/27 16:43:41 - mmengine - INFO - Epoch(train) [37][440/940] lr: 1.0000e-03 eta: 2:06:01 time: 0.5861 data_time: 0.0073 memory: 15293 grad_norm: 5.0958 loss: 0.7125 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7125 2023/05/27 16:43:53 - mmengine - INFO - Epoch(train) [37][460/940] lr: 1.0000e-03 eta: 2:05:49 time: 0.5960 data_time: 0.0076 memory: 15293 grad_norm: 4.6139 loss: 0.5060 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5060 2023/05/27 16:44:05 - mmengine - INFO - Epoch(train) [37][480/940] lr: 1.0000e-03 eta: 2:05:37 time: 0.5932 data_time: 0.0074 memory: 15293 grad_norm: 4.3165 loss: 0.7071 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7071 2023/05/27 16:44:16 - mmengine - INFO - Epoch(train) [37][500/940] lr: 1.0000e-03 eta: 2:05:25 time: 0.5905 data_time: 0.0076 memory: 15293 grad_norm: 4.3402 loss: 0.6173 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6173 
2023/05/27 16:44:28 - mmengine - INFO - Epoch(train) [37][520/940] lr: 1.0000e-03 eta: 2:05:13 time: 0.5914 data_time: 0.0073 memory: 15293 grad_norm: 5.0514 loss: 0.5736 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5736 2023/05/27 16:44:40 - mmengine - INFO - Epoch(train) [37][540/940] lr: 1.0000e-03 eta: 2:05:01 time: 0.5969 data_time: 0.0079 memory: 15293 grad_norm: 4.6167 loss: 0.6412 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6412 2023/05/27 16:44:52 - mmengine - INFO - Epoch(train) [37][560/940] lr: 1.0000e-03 eta: 2:04:49 time: 0.5945 data_time: 0.0077 memory: 15293 grad_norm: 4.7578 loss: 0.7970 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7970 2023/05/27 16:45:04 - mmengine - INFO - Epoch(train) [37][580/940] lr: 1.0000e-03 eta: 2:04:38 time: 0.5910 data_time: 0.0075 memory: 15293 grad_norm: 4.9049 loss: 0.7284 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7284 2023/05/27 16:45:16 - mmengine - INFO - Epoch(train) [37][600/940] lr: 1.0000e-03 eta: 2:04:25 time: 0.5859 data_time: 0.0076 memory: 15293 grad_norm: 4.6648 loss: 0.4068 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4068 2023/05/27 16:45:27 - mmengine - INFO - Epoch(train) [37][620/940] lr: 1.0000e-03 eta: 2:04:13 time: 0.5891 data_time: 0.0076 memory: 15293 grad_norm: 5.4832 loss: 0.5459 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5459 2023/05/27 16:45:39 - mmengine - INFO - Epoch(train) [37][640/940] lr: 1.0000e-03 eta: 2:04:01 time: 0.5871 data_time: 0.0074 memory: 15293 grad_norm: 4.6960 loss: 0.6302 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6302 2023/05/27 16:45:51 - mmengine - INFO - Epoch(train) [37][660/940] lr: 1.0000e-03 eta: 2:03:50 time: 0.5944 data_time: 0.0077 memory: 15293 grad_norm: 4.6423 loss: 0.6061 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6061 2023/05/27 16:46:03 - mmengine - INFO - Epoch(train) [37][680/940] lr: 1.0000e-03 eta: 2:03:38 time: 0.5873 data_time: 0.0077 memory: 15293 grad_norm: 4.3714 loss: 0.5699 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5699 
2023/05/27 16:46:15 - mmengine - INFO - Epoch(train) [37][700/940] lr: 1.0000e-03 eta: 2:03:26 time: 0.5951 data_time: 0.0075 memory: 15293 grad_norm: 4.4348 loss: 0.6211 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6211 2023/05/27 16:46:26 - mmengine - INFO - Epoch(train) [37][720/940] lr: 1.0000e-03 eta: 2:03:14 time: 0.5864 data_time: 0.0080 memory: 15293 grad_norm: 4.8500 loss: 0.6510 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6510 2023/05/27 16:46:38 - mmengine - INFO - Epoch(train) [37][740/940] lr: 1.0000e-03 eta: 2:03:02 time: 0.5954 data_time: 0.0075 memory: 15293 grad_norm: 6.1596 loss: 0.7197 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7197 2023/05/27 16:46:50 - mmengine - INFO - Epoch(train) [37][760/940] lr: 1.0000e-03 eta: 2:02:50 time: 0.5925 data_time: 0.0075 memory: 15293 grad_norm: 4.7918 loss: 0.6116 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6116 2023/05/27 16:47:02 - mmengine - INFO - Epoch(train) [37][780/940] lr: 1.0000e-03 eta: 2:02:38 time: 0.5910 data_time: 0.0076 memory: 15293 grad_norm: 4.5763 loss: 0.7757 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7757 2023/05/27 16:47:14 - mmengine - INFO - Epoch(train) [37][800/940] lr: 1.0000e-03 eta: 2:02:26 time: 0.5885 data_time: 0.0081 memory: 15293 grad_norm: 4.8303 loss: 0.6793 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6793 2023/05/27 16:47:26 - mmengine - INFO - Epoch(train) [37][820/940] lr: 1.0000e-03 eta: 2:02:14 time: 0.5903 data_time: 0.0085 memory: 15293 grad_norm: 4.2871 loss: 0.5051 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5051 2023/05/27 16:47:37 - mmengine - INFO - Epoch(train) [37][840/940] lr: 1.0000e-03 eta: 2:02:02 time: 0.5931 data_time: 0.0078 memory: 15293 grad_norm: 4.9908 loss: 0.6306 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6306 2023/05/27 16:47:49 - mmengine - INFO - Epoch(train) [37][860/940] lr: 1.0000e-03 eta: 2:01:50 time: 0.5893 data_time: 0.0075 memory: 15293 grad_norm: 4.7294 loss: 0.4925 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4925 
2023/05/27 16:48:01 - mmengine - INFO - Epoch(train) [37][880/940] lr: 1.0000e-03 eta: 2:01:38 time: 0.5861 data_time: 0.0077 memory: 15293 grad_norm: 4.8140 loss: 0.7369 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7369 2023/05/27 16:48:13 - mmengine - INFO - Epoch(train) [37][900/940] lr: 1.0000e-03 eta: 2:01:26 time: 0.5901 data_time: 0.0078 memory: 15293 grad_norm: 4.6885 loss: 0.6024 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6024 2023/05/27 16:48:25 - mmengine - INFO - Epoch(train) [37][920/940] lr: 1.0000e-03 eta: 2:01:14 time: 0.5909 data_time: 0.0073 memory: 15293 grad_norm: 4.5238 loss: 0.7218 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7218 2023/05/27 16:48:36 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:48:36 - mmengine - INFO - Epoch(train) [37][940/940] lr: 1.0000e-03 eta: 2:01:02 time: 0.5670 data_time: 0.0073 memory: 15293 grad_norm: 4.6608 loss: 0.5000 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5000 2023/05/27 16:48:42 - mmengine - INFO - Epoch(val) [37][20/78] eta: 0:00:17 time: 0.2991 data_time: 0.1144 memory: 2851 2023/05/27 16:48:46 - mmengine - INFO - Epoch(val) [37][40/78] eta: 0:00:09 time: 0.2165 data_time: 0.0312 memory: 2851 2023/05/27 16:48:51 - mmengine - INFO - Epoch(val) [37][60/78] eta: 0:00:04 time: 0.2305 data_time: 0.0455 memory: 2851 2023/05/27 16:48:56 - mmengine - INFO - Epoch(val) [37][78/78] acc/top1: 0.7706 acc/top5: 0.9304 acc/mean1: 0.7704 data_time: 0.0495 time: 0.2315 2023/05/27 16:49:10 - mmengine - INFO - Epoch(train) [38][ 20/940] lr: 1.0000e-03 eta: 2:00:52 time: 0.6954 data_time: 0.0880 memory: 15293 grad_norm: 4.8131 loss: 0.5697 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5697 2023/05/27 16:49:22 - mmengine - INFO - Epoch(train) [38][ 40/940] lr: 1.0000e-03 eta: 2:00:40 time: 0.5860 data_time: 0.0078 memory: 15293 grad_norm: 4.4820 loss: 0.5320 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5320 2023/05/27 16:49:34 - 
mmengine - INFO - Epoch(train) [38][ 60/940] lr: 1.0000e-03 eta: 2:00:28 time: 0.5898 data_time: 0.0075 memory: 15293 grad_norm: 4.4287 loss: 0.5989 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5989 2023/05/27 16:49:46 - mmengine - INFO - Epoch(train) [38][ 80/940] lr: 1.0000e-03 eta: 2:00:16 time: 0.6000 data_time: 0.0074 memory: 15293 grad_norm: 4.8325 loss: 0.7077 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7077 2023/05/27 16:49:58 - mmengine - INFO - Epoch(train) [38][100/940] lr: 1.0000e-03 eta: 2:00:04 time: 0.5954 data_time: 0.0074 memory: 15293 grad_norm: 4.3440 loss: 0.6232 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6232 2023/05/27 16:50:09 - mmengine - INFO - Epoch(train) [38][120/940] lr: 1.0000e-03 eta: 1:59:52 time: 0.5872 data_time: 0.0076 memory: 15293 grad_norm: 4.5772 loss: 0.7626 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7626 2023/05/27 16:50:21 - mmengine - INFO - Epoch(train) [38][140/940] lr: 1.0000e-03 eta: 1:59:40 time: 0.5906 data_time: 0.0078 memory: 15293 grad_norm: 4.7020 loss: 0.7204 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7204 2023/05/27 16:50:33 - mmengine - INFO - Epoch(train) [38][160/940] lr: 1.0000e-03 eta: 1:59:28 time: 0.5881 data_time: 0.0078 memory: 15293 grad_norm: 4.8385 loss: 0.6612 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6612 2023/05/27 16:50:45 - mmengine - INFO - Epoch(train) [38][180/940] lr: 1.0000e-03 eta: 1:59:16 time: 0.5917 data_time: 0.0075 memory: 15293 grad_norm: 4.4408 loss: 0.6995 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6995 2023/05/27 16:50:57 - mmengine - INFO - Epoch(train) [38][200/940] lr: 1.0000e-03 eta: 1:59:04 time: 0.5887 data_time: 0.0074 memory: 15293 grad_norm: 4.6061 loss: 0.6500 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6500 2023/05/27 16:51:08 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:51:08 - mmengine - INFO - Epoch(train) [38][220/940] lr: 1.0000e-03 eta: 1:58:52 time: 
0.5871 data_time: 0.0076 memory: 15293 grad_norm: 4.7397 loss: 0.7605 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7605 2023/05/27 16:51:20 - mmengine - INFO - Epoch(train) [38][240/940] lr: 1.0000e-03 eta: 1:58:40 time: 0.5893 data_time: 0.0074 memory: 15293 grad_norm: 5.0554 loss: 0.7464 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7464 2023/05/27 16:51:32 - mmengine - INFO - Epoch(train) [38][260/940] lr: 1.0000e-03 eta: 1:58:28 time: 0.6007 data_time: 0.0075 memory: 15293 grad_norm: 4.6058 loss: 0.6254 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6254 2023/05/27 16:51:44 - mmengine - INFO - Epoch(train) [38][280/940] lr: 1.0000e-03 eta: 1:58:16 time: 0.5895 data_time: 0.0076 memory: 15293 grad_norm: 4.6677 loss: 0.6376 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6376 2023/05/27 16:51:56 - mmengine - INFO - Epoch(train) [38][300/940] lr: 1.0000e-03 eta: 1:58:04 time: 0.5886 data_time: 0.0073 memory: 15293 grad_norm: 4.3643 loss: 0.5771 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5771 2023/05/27 16:52:07 - mmengine - INFO - Epoch(train) [38][320/940] lr: 1.0000e-03 eta: 1:57:53 time: 0.5940 data_time: 0.0074 memory: 15293 grad_norm: 4.8861 loss: 0.5749 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5749 2023/05/27 16:52:19 - mmengine - INFO - Epoch(train) [38][340/940] lr: 1.0000e-03 eta: 1:57:41 time: 0.5903 data_time: 0.0076 memory: 15293 grad_norm: 4.4832 loss: 0.7802 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7802 2023/05/27 16:52:31 - mmengine - INFO - Epoch(train) [38][360/940] lr: 1.0000e-03 eta: 1:57:29 time: 0.5903 data_time: 0.0075 memory: 15293 grad_norm: 4.8585 loss: 0.6038 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6038 2023/05/27 16:52:43 - mmengine - INFO - Epoch(train) [38][380/940] lr: 1.0000e-03 eta: 1:57:17 time: 0.5871 data_time: 0.0078 memory: 15293 grad_norm: 4.5099 loss: 0.6986 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.6986 2023/05/27 16:52:55 - mmengine - INFO - Epoch(train) [38][400/940] lr: 1.0000e-03 eta: 1:57:05 time: 
0.5943 data_time: 0.0075 memory: 15293 grad_norm: 4.6375 loss: 0.6724 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6724 2023/05/27 16:53:06 - mmengine - INFO - Epoch(train) [38][420/940] lr: 1.0000e-03 eta: 1:56:53 time: 0.5867 data_time: 0.0075 memory: 15293 grad_norm: 4.4962 loss: 0.8670 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.8670 2023/05/27 16:53:18 - mmengine - INFO - Epoch(train) [38][440/940] lr: 1.0000e-03 eta: 1:56:41 time: 0.5943 data_time: 0.0074 memory: 15293 grad_norm: 4.5723 loss: 0.7042 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7042 2023/05/27 16:53:30 - mmengine - INFO - Epoch(train) [38][460/940] lr: 1.0000e-03 eta: 1:56:29 time: 0.5904 data_time: 0.0075 memory: 15293 grad_norm: 5.5721 loss: 0.6443 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6443 2023/05/27 16:53:42 - mmengine - INFO - Epoch(train) [38][480/940] lr: 1.0000e-03 eta: 1:56:17 time: 0.5914 data_time: 0.0074 memory: 15293 grad_norm: 4.6547 loss: 0.5844 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5844 2023/05/27 16:53:54 - mmengine - INFO - Epoch(train) [38][500/940] lr: 1.0000e-03 eta: 1:56:05 time: 0.5943 data_time: 0.0074 memory: 15293 grad_norm: 4.4049 loss: 0.5985 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5985 2023/05/27 16:54:06 - mmengine - INFO - Epoch(train) [38][520/940] lr: 1.0000e-03 eta: 1:55:53 time: 0.5961 data_time: 0.0082 memory: 15293 grad_norm: 4.8584 loss: 0.4937 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4937 2023/05/27 16:54:18 - mmengine - INFO - Epoch(train) [38][540/940] lr: 1.0000e-03 eta: 1:55:41 time: 0.5902 data_time: 0.0076 memory: 15293 grad_norm: 4.4989 loss: 0.7681 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7681 2023/05/27 16:54:29 - mmengine - INFO - Epoch(train) [38][560/940] lr: 1.0000e-03 eta: 1:55:29 time: 0.5879 data_time: 0.0074 memory: 15293 grad_norm: 4.6440 loss: 0.6350 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6350 2023/05/27 16:54:41 - mmengine - INFO - Epoch(train) [38][580/940] lr: 1.0000e-03 eta: 1:55:18 time: 
0.6012 data_time: 0.0075 memory: 15293 grad_norm: 4.4545 loss: 0.5959 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5959 2023/05/27 16:54:53 - mmengine - INFO - Epoch(train) [38][600/940] lr: 1.0000e-03 eta: 1:55:06 time: 0.5917 data_time: 0.0079 memory: 15293 grad_norm: 4.3402 loss: 0.6609 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6609 2023/05/27 16:55:05 - mmengine - INFO - Epoch(train) [38][620/940] lr: 1.0000e-03 eta: 1:54:54 time: 0.5988 data_time: 0.0080 memory: 15293 grad_norm: 4.7768 loss: 0.5035 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5035 2023/05/27 16:55:17 - mmengine - INFO - Epoch(train) [38][640/940] lr: 1.0000e-03 eta: 1:54:42 time: 0.5953 data_time: 0.0076 memory: 15293 grad_norm: 4.5521 loss: 0.6146 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6146 2023/05/27 16:55:29 - mmengine - INFO - Epoch(train) [38][660/940] lr: 1.0000e-03 eta: 1:54:30 time: 0.5895 data_time: 0.0075 memory: 15293 grad_norm: 4.9480 loss: 0.5459 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5459 2023/05/27 16:55:41 - mmengine - INFO - Epoch(train) [38][680/940] lr: 1.0000e-03 eta: 1:54:18 time: 0.5908 data_time: 0.0076 memory: 15293 grad_norm: 5.0514 loss: 0.4920 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4920 2023/05/27 16:55:53 - mmengine - INFO - Epoch(train) [38][700/940] lr: 1.0000e-03 eta: 1:54:06 time: 0.6057 data_time: 0.0073 memory: 15293 grad_norm: 4.5774 loss: 0.5410 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5410 2023/05/27 16:56:05 - mmengine - INFO - Epoch(train) [38][720/940] lr: 1.0000e-03 eta: 1:53:54 time: 0.5886 data_time: 0.0075 memory: 15293 grad_norm: 4.5559 loss: 0.6104 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6104 2023/05/27 16:56:17 - mmengine - INFO - Epoch(train) [38][740/940] lr: 1.0000e-03 eta: 1:53:42 time: 0.5957 data_time: 0.0077 memory: 15293 grad_norm: 4.7589 loss: 0.5726 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5726 2023/05/27 16:56:28 - mmengine - INFO - Epoch(train) [38][760/940] lr: 1.0000e-03 eta: 1:53:31 time: 
0.5962 data_time: 0.0076 memory: 15293 grad_norm: 4.4021 loss: 0.6405 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6405 2023/05/27 16:56:40 - mmengine - INFO - Epoch(train) [38][780/940] lr: 1.0000e-03 eta: 1:53:19 time: 0.5956 data_time: 0.0075 memory: 15293 grad_norm: 5.1323 loss: 0.6067 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6067 2023/05/27 16:56:52 - mmengine - INFO - Epoch(train) [38][800/940] lr: 1.0000e-03 eta: 1:53:07 time: 0.5895 data_time: 0.0073 memory: 15293 grad_norm: 4.6033 loss: 0.4898 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4898 2023/05/27 16:57:04 - mmengine - INFO - Epoch(train) [38][820/940] lr: 1.0000e-03 eta: 1:52:55 time: 0.5894 data_time: 0.0074 memory: 15293 grad_norm: 4.3772 loss: 0.6058 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6058 2023/05/27 16:57:16 - mmengine - INFO - Epoch(train) [38][840/940] lr: 1.0000e-03 eta: 1:52:43 time: 0.5989 data_time: 0.0074 memory: 15293 grad_norm: 4.4849 loss: 0.5359 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5359 2023/05/27 16:57:28 - mmengine - INFO - Epoch(train) [38][860/940] lr: 1.0000e-03 eta: 1:52:31 time: 0.5885 data_time: 0.0077 memory: 15293 grad_norm: 4.2815 loss: 0.6092 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6092 2023/05/27 16:57:39 - mmengine - INFO - Epoch(train) [38][880/940] lr: 1.0000e-03 eta: 1:52:19 time: 0.5870 data_time: 0.0076 memory: 15293 grad_norm: 5.6604 loss: 0.5131 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5131 2023/05/27 16:57:51 - mmengine - INFO - Epoch(train) [38][900/940] lr: 1.0000e-03 eta: 1:52:07 time: 0.5890 data_time: 0.0086 memory: 15293 grad_norm: 5.6857 loss: 0.5388 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5388 2023/05/27 16:58:03 - mmengine - INFO - Epoch(train) [38][920/940] lr: 1.0000e-03 eta: 1:51:55 time: 0.5919 data_time: 0.0078 memory: 15293 grad_norm: 6.5913 loss: 0.4810 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4810 2023/05/27 16:58:15 - mmengine - INFO - Exp name: 
tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 16:58:15 - mmengine - INFO - Epoch(train) [38][940/940] lr: 1.0000e-03 eta: 1:51:43 time: 0.5744 data_time: 0.0075 memory: 15293 grad_norm: 4.8746 loss: 0.6276 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.6276 2023/05/27 16:58:20 - mmengine - INFO - Epoch(val) [38][20/78] eta: 0:00:16 time: 0.2889 data_time: 0.1040 memory: 2851 2023/05/27 16:58:25 - mmengine - INFO - Epoch(val) [38][40/78] eta: 0:00:09 time: 0.2268 data_time: 0.0418 memory: 2851 2023/05/27 16:58:29 - mmengine - INFO - Epoch(val) [38][60/78] eta: 0:00:04 time: 0.2295 data_time: 0.0443 memory: 2851 2023/05/27 16:59:09 - mmengine - INFO - Epoch(val) [38][78/78] acc/top1: 0.7690 acc/top5: 0.9290 acc/mean1: 0.7689 data_time: 0.0494 time: 0.2314 2023/05/27 16:59:23 - mmengine - INFO - Epoch(train) [39][ 20/940] lr: 1.0000e-03 eta: 1:51:32 time: 0.6798 data_time: 0.0750 memory: 15293 grad_norm: 4.9963 loss: 0.6216 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6216 2023/05/27 16:59:34 - mmengine - INFO - Epoch(train) [39][ 40/940] lr: 1.0000e-03 eta: 1:51:20 time: 0.5856 data_time: 0.0077 memory: 15293 grad_norm: 6.0514 loss: 0.7069 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7069 2023/05/27 16:59:46 - mmengine - INFO - Epoch(train) [39][ 60/940] lr: 1.0000e-03 eta: 1:51:08 time: 0.5940 data_time: 0.0076 memory: 15293 grad_norm: 4.4795 loss: 0.5635 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5635 2023/05/27 16:59:58 - mmengine - INFO - Epoch(train) [39][ 80/940] lr: 1.0000e-03 eta: 1:50:56 time: 0.5904 data_time: 0.0078 memory: 15293 grad_norm: 4.2307 loss: 0.6698 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6698 2023/05/27 17:00:10 - mmengine - INFO - Epoch(train) [39][100/940] lr: 1.0000e-03 eta: 1:50:45 time: 0.5928 data_time: 0.0076 memory: 15293 grad_norm: 4.4766 loss: 0.8545 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8545 2023/05/27 17:00:22 - mmengine - INFO - Epoch(train) [39][120/940] lr: 
1.0000e-03 eta: 1:50:33 time: 0.5896 data_time: 0.0076 memory: 15293 grad_norm: 4.6914 loss: 0.5485 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5485 2023/05/27 17:00:34 - mmengine - INFO - Epoch(train) [39][140/940] lr: 1.0000e-03 eta: 1:50:21 time: 0.5956 data_time: 0.0078 memory: 15293 grad_norm: 4.7482 loss: 0.5113 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5113 2023/05/27 17:00:45 - mmengine - INFO - Epoch(train) [39][160/940] lr: 1.0000e-03 eta: 1:50:09 time: 0.5860 data_time: 0.0076 memory: 15293 grad_norm: 4.7469 loss: 0.7481 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7481 2023/05/27 17:00:57 - mmengine - INFO - Epoch(train) [39][180/940] lr: 1.0000e-03 eta: 1:49:57 time: 0.5937 data_time: 0.0075 memory: 15293 grad_norm: 4.5634 loss: 0.5757 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5757 2023/05/27 17:01:09 - mmengine - INFO - Epoch(train) [39][200/940] lr: 1.0000e-03 eta: 1:49:45 time: 0.5918 data_time: 0.0077 memory: 15293 grad_norm: 4.8423 loss: 0.6014 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6014 2023/05/27 17:01:21 - mmengine - INFO - Epoch(train) [39][220/940] lr: 1.0000e-03 eta: 1:49:33 time: 0.5908 data_time: 0.0076 memory: 15293 grad_norm: 4.9729 loss: 0.5652 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5652 2023/05/27 17:01:33 - mmengine - INFO - Epoch(train) [39][240/940] lr: 1.0000e-03 eta: 1:49:21 time: 0.5888 data_time: 0.0095 memory: 15293 grad_norm: 4.6012 loss: 0.7200 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7200 2023/05/27 17:01:44 - mmengine - INFO - Epoch(train) [39][260/940] lr: 1.0000e-03 eta: 1:49:09 time: 0.5873 data_time: 0.0078 memory: 15293 grad_norm: 4.6487 loss: 0.4501 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4501 2023/05/27 17:01:56 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 17:01:56 - mmengine - INFO - Epoch(train) [39][280/940] lr: 1.0000e-03 eta: 1:48:57 time: 0.5935 data_time: 0.0078 memory: 15293 grad_norm: 
4.4888 loss: 0.5565 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5565 2023/05/27 17:02:08 - mmengine - INFO - Epoch(train) [39][300/940] lr: 1.0000e-03 eta: 1:48:45 time: 0.5999 data_time: 0.0075 memory: 15293 grad_norm: 5.5727 loss: 0.7311 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7311 2023/05/27 17:02:20 - mmengine - INFO - Epoch(train) [39][320/940] lr: 1.0000e-03 eta: 1:48:33 time: 0.5918 data_time: 0.0077 memory: 15293 grad_norm: 4.4992 loss: 0.7823 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7823 2023/05/27 17:02:32 - mmengine - INFO - Epoch(train) [39][340/940] lr: 1.0000e-03 eta: 1:48:21 time: 0.5908 data_time: 0.0076 memory: 15293 grad_norm: 4.5448 loss: 0.4979 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4979 2023/05/27 17:02:44 - mmengine - INFO - Epoch(train) [39][360/940] lr: 1.0000e-03 eta: 1:48:10 time: 0.5924 data_time: 0.0075 memory: 15293 grad_norm: 5.3798 loss: 0.4549 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4549 2023/05/27 17:02:56 - mmengine - INFO - Epoch(train) [39][380/940] lr: 1.0000e-03 eta: 1:47:58 time: 0.5977 data_time: 0.0075 memory: 15293 grad_norm: 5.5584 loss: 0.8134 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8134 2023/05/27 17:03:08 - mmengine - INFO - Epoch(train) [39][400/940] lr: 1.0000e-03 eta: 1:47:46 time: 0.5906 data_time: 0.0073 memory: 15293 grad_norm: 4.4326 loss: 0.6583 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6583 2023/05/27 17:03:19 - mmengine - INFO - Epoch(train) [39][420/940] lr: 1.0000e-03 eta: 1:47:34 time: 0.5901 data_time: 0.0076 memory: 15293 grad_norm: 4.6464 loss: 0.5923 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5923 2023/05/27 17:03:31 - mmengine - INFO - Epoch(train) [39][440/940] lr: 1.0000e-03 eta: 1:47:22 time: 0.5900 data_time: 0.0076 memory: 15293 grad_norm: 5.1902 loss: 0.7138 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7138 2023/05/27 17:03:43 - mmengine - INFO - Epoch(train) [39][460/940] lr: 1.0000e-03 eta: 1:47:10 time: 0.5922 data_time: 0.0078 memory: 15293 grad_norm: 
5.3923 loss: 0.3902 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.3902 2023/05/27 17:03:55 - mmengine - INFO - Epoch(train) [39][480/940] lr: 1.0000e-03 eta: 1:46:58 time: 0.5910 data_time: 0.0076 memory: 15293 grad_norm: 4.7041 loss: 0.6745 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.6745 2023/05/27 17:04:07 - mmengine - INFO - Epoch(train) [39][500/940] lr: 1.0000e-03 eta: 1:46:46 time: 0.5895 data_time: 0.0076 memory: 15293 grad_norm: 4.9558 loss: 0.5718 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.5718 2023/05/27 17:04:19 - mmengine - INFO - Epoch(train) [39][520/940] lr: 1.0000e-03 eta: 1:46:34 time: 0.5938 data_time: 0.0075 memory: 15293 grad_norm: 4.7860 loss: 0.8167 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8167 2023/05/27 17:04:30 - mmengine - INFO - Epoch(train) [39][540/940] lr: 1.0000e-03 eta: 1:46:22 time: 0.5865 data_time: 0.0076 memory: 15293 grad_norm: 4.5882 loss: 0.6885 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.6885 2023/05/27 17:04:42 - mmengine - INFO - Epoch(train) [39][560/940] lr: 1.0000e-03 eta: 1:46:10 time: 0.5961 data_time: 0.0078 memory: 15293 grad_norm: 4.8419 loss: 0.7416 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7416 2023/05/27 17:04:54 - mmengine - INFO - Epoch(train) [39][580/940] lr: 1.0000e-03 eta: 1:45:58 time: 0.5929 data_time: 0.0078 memory: 15293 grad_norm: 4.6394 loss: 0.6174 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6174 2023/05/27 17:05:06 - mmengine - INFO - Epoch(train) [39][600/940] lr: 1.0000e-03 eta: 1:45:46 time: 0.5868 data_time: 0.0076 memory: 15293 grad_norm: 4.7082 loss: 0.5985 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5985 2023/05/27 17:05:18 - mmengine - INFO - Epoch(train) [39][620/940] lr: 1.0000e-03 eta: 1:45:35 time: 0.5906 data_time: 0.0077 memory: 15293 grad_norm: 4.6649 loss: 0.6104 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6104 2023/05/27 17:05:30 - mmengine - INFO - Epoch(train) [39][640/940] lr: 1.0000e-03 eta: 1:45:23 time: 0.5964 data_time: 0.0075 memory: 15293 grad_norm: 
4.3249 loss: 0.4609 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4609 2023/05/27 17:05:41 - mmengine - INFO - Epoch(train) [39][660/940] lr: 1.0000e-03 eta: 1:45:11 time: 0.5867 data_time: 0.0073 memory: 15293 grad_norm: 4.5473 loss: 0.6830 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6830 2023/05/27 17:05:53 - mmengine - INFO - Epoch(train) [39][680/940] lr: 1.0000e-03 eta: 1:44:59 time: 0.5920 data_time: 0.0075 memory: 15293 grad_norm: 4.4039 loss: 0.6991 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6991 2023/05/27 17:06:05 - mmengine - INFO - Epoch(train) [39][700/940] lr: 1.0000e-03 eta: 1:44:47 time: 0.5982 data_time: 0.0076 memory: 15293 grad_norm: 5.0083 loss: 0.6005 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6005 2023/05/27 17:06:17 - mmengine - INFO - Epoch(train) [39][720/940] lr: 1.0000e-03 eta: 1:44:35 time: 0.5853 data_time: 0.0076 memory: 15293 grad_norm: 4.6045 loss: 0.8421 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.8421 2023/05/27 17:06:29 - mmengine - INFO - Epoch(train) [39][740/940] lr: 1.0000e-03 eta: 1:44:23 time: 0.5897 data_time: 0.0076 memory: 15293 grad_norm: 4.7185 loss: 0.5516 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5516 2023/05/27 17:06:41 - mmengine - INFO - Epoch(train) [39][760/940] lr: 1.0000e-03 eta: 1:44:11 time: 0.5974 data_time: 0.0075 memory: 15293 grad_norm: 5.0899 loss: 0.5214 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5214 2023/05/27 17:06:52 - mmengine - INFO - Epoch(train) [39][780/940] lr: 1.0000e-03 eta: 1:43:59 time: 0.5907 data_time: 0.0078 memory: 15293 grad_norm: 4.3378 loss: 0.5053 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5053 2023/05/27 17:07:04 - mmengine - INFO - Epoch(train) [39][800/940] lr: 1.0000e-03 eta: 1:43:47 time: 0.5858 data_time: 0.0075 memory: 15293 grad_norm: 4.3808 loss: 0.6404 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6404 2023/05/27 17:07:16 - mmengine - INFO - Epoch(train) [39][820/940] lr: 1.0000e-03 eta: 1:43:35 time: 0.5994 data_time: 0.0075 memory: 15293 grad_norm: 
4.8913 loss: 0.6232 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6232 2023/05/27 17:07:28 - mmengine - INFO - Epoch(train) [39][840/940] lr: 1.0000e-03 eta: 1:43:23 time: 0.5901 data_time: 0.0077 memory: 15293 grad_norm: 4.7702 loss: 0.6078 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6078 2023/05/27 17:07:40 - mmengine - INFO - Epoch(train) [39][860/940] lr: 1.0000e-03 eta: 1:43:12 time: 0.5918 data_time: 0.0076 memory: 15293 grad_norm: 5.2562 loss: 0.4562 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4562 2023/05/27 17:07:52 - mmengine - INFO - Epoch(train) [39][880/940] lr: 1.0000e-03 eta: 1:43:00 time: 0.5933 data_time: 0.0096 memory: 15293 grad_norm: 5.4018 loss: 0.6568 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6568 2023/05/27 17:08:03 - mmengine - INFO - Epoch(train) [39][900/940] lr: 1.0000e-03 eta: 1:42:48 time: 0.5903 data_time: 0.0078 memory: 15293 grad_norm: 5.5445 loss: 0.4903 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.4903 2023/05/27 17:08:15 - mmengine - INFO - Epoch(train) [39][920/940] lr: 1.0000e-03 eta: 1:42:36 time: 0.5854 data_time: 0.0075 memory: 15293 grad_norm: 4.4891 loss: 0.4045 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4045 2023/05/27 17:08:26 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 17:08:26 - mmengine - INFO - Epoch(train) [39][940/940] lr: 1.0000e-03 eta: 1:42:23 time: 0.5670 data_time: 0.0076 memory: 15293 grad_norm: 5.2779 loss: 0.5620 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5620 2023/05/27 17:08:26 - mmengine - INFO - Saving checkpoint at 39 epochs 2023/05/27 17:08:35 - mmengine - INFO - Epoch(val) [39][20/78] eta: 0:00:17 time: 0.3022 data_time: 0.1177 memory: 2851 2023/05/27 17:08:40 - mmengine - INFO - Epoch(val) [39][40/78] eta: 0:00:10 time: 0.2356 data_time: 0.0506 memory: 2851 2023/05/27 17:08:44 - mmengine - INFO - Epoch(val) [39][60/78] eta: 0:00:04 time: 0.2266 data_time: 0.0428 memory: 2851 2023/05/27 17:09:50 - 
mmengine - INFO - Epoch(val) [39][78/78] acc/top1: 0.7695 acc/top5: 0.9294 acc/mean1: 0.7694 data_time: 0.0544 time: 0.2358 2023/05/27 17:10:03 - mmengine - INFO - Epoch(train) [40][ 20/940] lr: 1.0000e-03 eta: 1:42:13 time: 0.6776 data_time: 0.0751 memory: 15293 grad_norm: 5.0906 loss: 0.5241 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5241 2023/05/27 17:10:15 - mmengine - INFO - Epoch(train) [40][ 40/940] lr: 1.0000e-03 eta: 1:42:01 time: 0.5870 data_time: 0.0077 memory: 15293 grad_norm: 4.4637 loss: 0.5104 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5104 2023/05/27 17:10:27 - mmengine - INFO - Epoch(train) [40][ 60/940] lr: 1.0000e-03 eta: 1:41:49 time: 0.5931 data_time: 0.0080 memory: 15293 grad_norm: 5.0233 loss: 0.6398 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6398 2023/05/27 17:10:39 - mmengine - INFO - Epoch(train) [40][ 80/940] lr: 1.0000e-03 eta: 1:41:37 time: 0.5934 data_time: 0.0078 memory: 15293 grad_norm: 4.6213 loss: 0.5585 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5585 2023/05/27 17:10:50 - mmengine - INFO - Epoch(train) [40][100/940] lr: 1.0000e-03 eta: 1:41:25 time: 0.5878 data_time: 0.0077 memory: 15293 grad_norm: 4.7032 loss: 0.5470 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5470 2023/05/27 17:11:02 - mmengine - INFO - Epoch(train) [40][120/940] lr: 1.0000e-03 eta: 1:41:13 time: 0.5893 data_time: 0.0076 memory: 15293 grad_norm: 6.1579 loss: 0.4728 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4728 2023/05/27 17:11:14 - mmengine - INFO - Epoch(train) [40][140/940] lr: 1.0000e-03 eta: 1:41:01 time: 0.5905 data_time: 0.0078 memory: 15293 grad_norm: 5.3551 loss: 0.7663 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.7663 2023/05/27 17:11:26 - mmengine - INFO - Epoch(train) [40][160/940] lr: 1.0000e-03 eta: 1:40:49 time: 0.5886 data_time: 0.0075 memory: 15293 grad_norm: 4.8322 loss: 0.4705 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.4705 2023/05/27 17:11:38 - mmengine - INFO - Epoch(train) [40][180/940] lr: 1.0000e-03 eta: 1:40:37 
time: 0.5917 data_time: 0.0077 memory: 15293 grad_norm: 4.7787 loss: 0.6575 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6575
2023/05/27 17:11:50 - mmengine - INFO - Epoch(train) [40][200/940] lr: 1.0000e-03 eta: 1:40:25 time: 0.5959 data_time: 0.0077 memory: 15293 grad_norm: 4.4863 loss: 0.6597 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6597
2023/05/27 17:12:02 - mmengine - INFO - Epoch(train) [40][220/940] lr: 1.0000e-03 eta: 1:40:13 time: 0.5992 data_time: 0.0076 memory: 15293 grad_norm: 4.7332 loss: 0.5999 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5999
2023/05/27 17:12:13 - mmengine - INFO - Epoch(train) [40][240/940] lr: 1.0000e-03 eta: 1:40:02 time: 0.5945 data_time: 0.0076 memory: 15293 grad_norm: 5.0052 loss: 0.5501 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5501
2023/05/27 17:12:25 - mmengine - INFO - Epoch(train) [40][260/940] lr: 1.0000e-03 eta: 1:39:50 time: 0.5913 data_time: 0.0078 memory: 15293 grad_norm: 4.6034 loss: 0.6249 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6249
2023/05/27 17:12:37 - mmengine - INFO - Epoch(train) [40][280/940] lr: 1.0000e-03 eta: 1:39:38 time: 0.5976 data_time: 0.0078 memory: 15293 grad_norm: 4.4925 loss: 0.4220 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4220
2023/05/27 17:12:49 - mmengine - INFO - Epoch(train) [40][300/940] lr: 1.0000e-03 eta: 1:39:26 time: 0.5979 data_time: 0.0075 memory: 15293 grad_norm: 4.8678 loss: 0.5275 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5275
2023/05/27 17:13:01 - mmengine - INFO - Epoch(train) [40][320/940] lr: 1.0000e-03 eta: 1:39:14 time: 0.5980 data_time: 0.0075 memory: 15293 grad_norm: 4.9067 loss: 0.6932 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6932
2023/05/27 17:13:13 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:13:13 - mmengine - INFO - Epoch(train) [40][340/940] lr: 1.0000e-03 eta: 1:39:02 time: 0.6043 data_time: 0.0076 memory: 15293 grad_norm: 4.9767 loss: 0.5427 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5427
2023/05/27 17:13:25 - mmengine - INFO - Epoch(train) [40][360/940] lr: 1.0000e-03 eta: 1:38:51 time: 0.5990 data_time: 0.0075 memory: 15293 grad_norm: 5.2781 loss: 0.5510 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5510
2023/05/27 17:13:37 - mmengine - INFO - Epoch(train) [40][380/940] lr: 1.0000e-03 eta: 1:38:39 time: 0.6008 data_time: 0.0074 memory: 15293 grad_norm: 5.2684 loss: 0.5772 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5772
2023/05/27 17:13:49 - mmengine - INFO - Epoch(train) [40][400/940] lr: 1.0000e-03 eta: 1:38:27 time: 0.5906 data_time: 0.0087 memory: 15293 grad_norm: 4.5393 loss: 0.6246 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6246
2023/05/27 17:14:01 - mmengine - INFO - Epoch(train) [40][420/940] lr: 1.0000e-03 eta: 1:38:15 time: 0.6028 data_time: 0.0185 memory: 15293 grad_norm: 4.7135 loss: 0.4538 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4538
2023/05/27 17:14:13 - mmengine - INFO - Epoch(train) [40][440/940] lr: 1.0000e-03 eta: 1:38:03 time: 0.5924 data_time: 0.0082 memory: 15293 grad_norm: 4.1668 loss: 0.5562 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5562
2023/05/27 17:14:25 - mmengine - INFO - Epoch(train) [40][460/940] lr: 1.0000e-03 eta: 1:37:51 time: 0.5866 data_time: 0.0076 memory: 15293 grad_norm: 4.5299 loss: 0.6734 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6734
2023/05/27 17:14:37 - mmengine - INFO - Epoch(train) [40][480/940] lr: 1.0000e-03 eta: 1:37:39 time: 0.5933 data_time: 0.0087 memory: 15293 grad_norm: 4.4522 loss: 0.7135 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7135
2023/05/27 17:14:49 - mmengine - INFO - Epoch(train) [40][500/940] lr: 1.0000e-03 eta: 1:37:27 time: 0.5969 data_time: 0.0166 memory: 15293 grad_norm: 5.4335 loss: 0.7234 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7234
2023/05/27 17:15:00 - mmengine - INFO - Epoch(train) [40][520/940] lr: 1.0000e-03 eta: 1:37:16 time: 0.5967 data_time: 0.0075 memory: 15293 grad_norm: 4.8198 loss: 0.5924 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5924
2023/05/27 17:15:12 - mmengine - INFO - Epoch(train) [40][540/940] lr: 1.0000e-03 eta: 1:37:04 time: 0.5980 data_time: 0.0164 memory: 15293 grad_norm: 4.7760 loss: 0.6517 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6517
2023/05/27 17:15:24 - mmengine - INFO - Epoch(train) [40][560/940] lr: 1.0000e-03 eta: 1:36:52 time: 0.5854 data_time: 0.0078 memory: 15293 grad_norm: 4.8354 loss: 0.4825 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4825
2023/05/27 17:15:36 - mmengine - INFO - Epoch(train) [40][580/940] lr: 1.0000e-03 eta: 1:36:40 time: 0.6072 data_time: 0.0285 memory: 15293 grad_norm: 4.8501 loss: 0.5832 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.5832
2023/05/27 17:15:48 - mmengine - INFO - Epoch(train) [40][600/940] lr: 1.0000e-03 eta: 1:36:28 time: 0.5907 data_time: 0.0078 memory: 15293 grad_norm: 4.6527 loss: 0.6788 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6788
2023/05/27 17:16:00 - mmengine - INFO - Epoch(train) [40][620/940] lr: 1.0000e-03 eta: 1:36:16 time: 0.5996 data_time: 0.0209 memory: 15293 grad_norm: 5.2763 loss: 0.6116 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6116
2023/05/27 17:16:12 - mmengine - INFO - Epoch(train) [40][640/940] lr: 1.0000e-03 eta: 1:36:04 time: 0.5933 data_time: 0.0119 memory: 15293 grad_norm: 4.4023 loss: 0.3697 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.3697
2023/05/27 17:16:24 - mmengine - INFO - Epoch(train) [40][660/940] lr: 1.0000e-03 eta: 1:35:52 time: 0.5864 data_time: 0.0079 memory: 15293 grad_norm: 4.7838 loss: 0.5282 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5282
2023/05/27 17:16:36 - mmengine - INFO - Epoch(train) [40][680/940] lr: 1.0000e-03 eta: 1:35:40 time: 0.5956 data_time: 0.0159 memory: 15293 grad_norm: 4.6657 loss: 0.6972 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6972
2023/05/27 17:16:47 - mmengine - INFO - Epoch(train) [40][700/940] lr: 1.0000e-03 eta: 1:35:28 time: 0.5871 data_time: 0.0079 memory: 15293 grad_norm: 4.8921 loss: 0.5592 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5592
2023/05/27 17:16:59 - mmengine - INFO - Epoch(train) [40][720/940] lr: 1.0000e-03 eta: 1:35:17 time: 0.5933 data_time: 0.0123 memory: 15293 grad_norm: 4.3967 loss: 0.5979 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5979
2023/05/27 17:17:11 - mmengine - INFO - Epoch(train) [40][740/940] lr: 1.0000e-03 eta: 1:35:05 time: 0.5897 data_time: 0.0077 memory: 15293 grad_norm: 5.3026 loss: 0.4733 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4733
2023/05/27 17:17:23 - mmengine - INFO - Epoch(train) [40][760/940] lr: 1.0000e-03 eta: 1:34:53 time: 0.5871 data_time: 0.0077 memory: 15293 grad_norm: 4.6791 loss: 0.7292 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7292
2023/05/27 17:17:35 - mmengine - INFO - Epoch(train) [40][780/940] lr: 1.0000e-03 eta: 1:34:41 time: 0.5917 data_time: 0.0075 memory: 15293 grad_norm: 4.6983 loss: 0.7232 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7232
2023/05/27 17:17:46 - mmengine - INFO - Epoch(train) [40][800/940] lr: 1.0000e-03 eta: 1:34:29 time: 0.5942 data_time: 0.0075 memory: 15293 grad_norm: 4.3900 loss: 0.8519 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8519
2023/05/27 17:17:58 - mmengine - INFO - Epoch(train) [40][820/940] lr: 1.0000e-03 eta: 1:34:17 time: 0.5897 data_time: 0.0075 memory: 15293 grad_norm: 4.7443 loss: 0.5223 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5223
2023/05/27 17:18:10 - mmengine - INFO - Epoch(train) [40][840/940] lr: 1.0000e-03 eta: 1:34:05 time: 0.5944 data_time: 0.0162 memory: 15293 grad_norm: 4.6017 loss: 0.4679 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4679
2023/05/27 17:18:22 - mmengine - INFO - Epoch(train) [40][860/940] lr: 1.0000e-03 eta: 1:33:53 time: 0.5991 data_time: 0.0191 memory: 15293 grad_norm: 4.5615 loss: 0.5143 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5143
2023/05/27 17:18:34 - mmengine - INFO - Epoch(train) [40][880/940] lr: 1.0000e-03 eta: 1:33:41 time: 0.5950 data_time: 0.0171 memory: 15293 grad_norm: 4.3369 loss: 0.6783 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6783
2023/05/27 17:18:46 - mmengine - INFO - Epoch(train) [40][900/940] lr: 1.0000e-03 eta: 1:33:29 time: 0.5859 data_time: 0.0077 memory: 15293 grad_norm: 5.3479 loss: 0.6122 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6122
2023/05/27 17:18:58 - mmengine - INFO - Epoch(train) [40][920/940] lr: 1.0000e-03 eta: 1:33:18 time: 0.5978 data_time: 0.0144 memory: 15293 grad_norm: 4.7777 loss: 0.7618 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7618
2023/05/27 17:19:09 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:19:09 - mmengine - INFO - Epoch(train) [40][940/940] lr: 1.0000e-03 eta: 1:33:05 time: 0.5759 data_time: 0.0167 memory: 15293 grad_norm: 4.9656 loss: 0.4455 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4455
2023/05/27 17:19:15 - mmengine - INFO - Epoch(val) [40][20/78] eta: 0:00:17 time: 0.2991 data_time: 0.1143 memory: 2851
2023/05/27 17:19:20 - mmengine - INFO - Epoch(val) [40][40/78] eta: 0:00:09 time: 0.2266 data_time: 0.0415 memory: 2851
2023/05/27 17:19:25 - mmengine - INFO - Epoch(val) [40][60/78] eta: 0:00:04 time: 0.2295 data_time: 0.0445 memory: 2851
2023/05/27 17:19:33 - mmengine - INFO - Epoch(val) [40][78/78] acc/top1: 0.7687 acc/top5: 0.9279 acc/mean1: 0.7686 data_time: 0.0517 time: 0.2336
2023/05/27 17:19:47 - mmengine - INFO - Epoch(train) [41][ 20/940] lr: 1.0000e-04 eta: 1:32:54 time: 0.6660 data_time: 0.0838 memory: 15293 grad_norm: 5.4457 loss: 0.5855 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5855
2023/05/27 17:19:58 - mmengine - INFO - Epoch(train) [41][ 40/940] lr: 1.0000e-04 eta: 1:32:42 time: 0.5905 data_time: 0.0075 memory: 15293 grad_norm: 4.9198 loss: 0.6904 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6904
2023/05/27 17:20:10 - mmengine - INFO - Epoch(train) [41][ 60/940] lr: 1.0000e-04 eta: 1:32:30 time: 0.5929 data_time: 0.0076 memory: 15293 grad_norm: 4.6570 loss: 0.4911 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4911
2023/05/27 17:20:22 - mmengine - INFO - Epoch(train) [41][ 80/940] lr: 1.0000e-04 eta: 1:32:19 time: 0.5950 data_time: 0.0077 memory: 15293 grad_norm: 4.5991 loss: 0.5406 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5406
2023/05/27 17:20:34 - mmengine - INFO - Epoch(train) [41][100/940] lr: 1.0000e-04 eta: 1:32:07 time: 0.5970 data_time: 0.0078 memory: 15293 grad_norm: 4.3518 loss: 0.6156 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6156
2023/05/27 17:20:46 - mmengine - INFO - Epoch(train) [41][120/940] lr: 1.0000e-04 eta: 1:31:55 time: 0.5985 data_time: 0.0080 memory: 15293 grad_norm: 4.7239 loss: 0.6050 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6050
2023/05/27 17:20:58 - mmengine - INFO - Epoch(train) [41][140/940] lr: 1.0000e-04 eta: 1:31:43 time: 0.5977 data_time: 0.0079 memory: 15293 grad_norm: 5.4427 loss: 0.4709 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4709
2023/05/27 17:21:10 - mmengine - INFO - Epoch(train) [41][160/940] lr: 1.0000e-04 eta: 1:31:31 time: 0.5888 data_time: 0.0078 memory: 15293 grad_norm: 4.8179 loss: 0.5290 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5290
2023/05/27 17:21:22 - mmengine - INFO - Epoch(train) [41][180/940] lr: 1.0000e-04 eta: 1:31:19 time: 0.5917 data_time: 0.0075 memory: 15293 grad_norm: 4.8578 loss: 0.5655 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5655
2023/05/27 17:21:33 - mmengine - INFO - Epoch(train) [41][200/940] lr: 1.0000e-04 eta: 1:31:07 time: 0.5904 data_time: 0.0076 memory: 15293 grad_norm: 4.4629 loss: 0.4094 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4094
2023/05/27 17:21:45 - mmengine - INFO - Epoch(train) [41][220/940] lr: 1.0000e-04 eta: 1:30:55 time: 0.5910 data_time: 0.0075 memory: 15293 grad_norm: 4.8105 loss: 0.6562 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6562
2023/05/27 17:21:57 - mmengine - INFO - Epoch(train) [41][240/940] lr: 1.0000e-04 eta: 1:30:43 time: 0.5922 data_time: 0.0075 memory: 15293 grad_norm: 4.7182 loss: 0.5415 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5415
2023/05/27 17:22:09 - mmengine - INFO - Epoch(train) [41][260/940] lr: 1.0000e-04 eta: 1:30:31 time: 0.5869 data_time: 0.0077 memory: 15293 grad_norm: 5.2570 loss: 0.5594 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5594
2023/05/27 17:22:21 - mmengine - INFO - Epoch(train) [41][280/940] lr: 1.0000e-04 eta: 1:30:20 time: 0.5930 data_time: 0.0075 memory: 15293 grad_norm: 4.8886 loss: 0.5132 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5132
2023/05/27 17:22:33 - mmengine - INFO - Epoch(train) [41][300/940] lr: 1.0000e-04 eta: 1:30:08 time: 0.5973 data_time: 0.0085 memory: 15293 grad_norm: 5.1581 loss: 0.6945 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6945
2023/05/27 17:22:44 - mmengine - INFO - Epoch(train) [41][320/940] lr: 1.0000e-04 eta: 1:29:56 time: 0.5905 data_time: 0.0076 memory: 15293 grad_norm: 4.5391 loss: 0.6896 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6896
2023/05/27 17:22:56 - mmengine - INFO - Epoch(train) [41][340/940] lr: 1.0000e-04 eta: 1:29:44 time: 0.5870 data_time: 0.0074 memory: 15293 grad_norm: 5.3789 loss: 0.6090 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6090
2023/05/27 17:23:08 - mmengine - INFO - Epoch(train) [41][360/940] lr: 1.0000e-04 eta: 1:29:32 time: 0.5944 data_time: 0.0073 memory: 15293 grad_norm: 4.9381 loss: 0.4145 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4145
2023/05/27 17:23:20 - mmengine - INFO - Epoch(train) [41][380/940] lr: 1.0000e-04 eta: 1:29:20 time: 0.5868 data_time: 0.0074 memory: 15293 grad_norm: 4.4359 loss: 0.7515 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7515
2023/05/27 17:23:32 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:23:32 - mmengine - INFO - Epoch(train) [41][400/940] lr: 1.0000e-04 eta: 1:29:08 time: 0.5870 data_time: 0.0075 memory: 15293 grad_norm: 4.7146 loss: 0.5089 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5089
2023/05/27 17:23:43 - mmengine - INFO - Epoch(train) [41][420/940] lr: 1.0000e-04 eta: 1:28:56 time: 0.5923 data_time: 0.0076 memory: 15293 grad_norm: 4.7261 loss: 0.5058 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5058
2023/05/27 17:23:55 - mmengine - INFO - Epoch(train) [41][440/940] lr: 1.0000e-04 eta: 1:28:44 time: 0.5863 data_time: 0.0076 memory: 15293 grad_norm: 4.6757 loss: 0.5972 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5972
2023/05/27 17:24:07 - mmengine - INFO - Epoch(train) [41][460/940] lr: 1.0000e-04 eta: 1:28:32 time: 0.5899 data_time: 0.0076 memory: 15293 grad_norm: 4.7151 loss: 0.5905 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5905
2023/05/27 17:24:19 - mmengine - INFO - Epoch(train) [41][480/940] lr: 1.0000e-04 eta: 1:28:20 time: 0.6005 data_time: 0.0076 memory: 15293 grad_norm: 4.1991 loss: 0.6088 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6088
2023/05/27 17:24:31 - mmengine - INFO - Epoch(train) [41][500/940] lr: 1.0000e-04 eta: 1:28:08 time: 0.5886 data_time: 0.0077 memory: 15293 grad_norm: 4.5127 loss: 0.6688 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6688
2023/05/27 17:24:42 - mmengine - INFO - Epoch(train) [41][520/940] lr: 1.0000e-04 eta: 1:27:57 time: 0.5863 data_time: 0.0075 memory: 15293 grad_norm: 4.8234 loss: 0.6858 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6858
2023/05/27 17:24:54 - mmengine - INFO - Epoch(train) [41][540/940] lr: 1.0000e-04 eta: 1:27:45 time: 0.5888 data_time: 0.0077 memory: 15293 grad_norm: 4.9210 loss: 0.5031 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5031
2023/05/27 17:25:06 - mmengine - INFO - Epoch(train) [41][560/940] lr: 1.0000e-04 eta: 1:27:33 time: 0.5880 data_time: 0.0075 memory: 15293 grad_norm: 4.5060 loss: 0.6669 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6669
2023/05/27 17:25:18 - mmengine - INFO - Epoch(train) [41][580/940] lr: 1.0000e-04 eta: 1:27:21 time: 0.5953 data_time: 0.0077 memory: 15293 grad_norm: 4.3992 loss: 0.5067 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.5067
2023/05/27 17:25:30 - mmengine - INFO - Epoch(train) [41][600/940] lr: 1.0000e-04 eta: 1:27:09 time: 0.5985 data_time: 0.0079 memory: 15293 grad_norm: 4.8722 loss: 0.7952 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7952
2023/05/27 17:25:42 - mmengine - INFO - Epoch(train) [41][620/940] lr: 1.0000e-04 eta: 1:26:57 time: 0.5953 data_time: 0.0077 memory: 15293 grad_norm: 4.7400 loss: 0.5695 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5695
2023/05/27 17:25:54 - mmengine - INFO - Epoch(train) [41][640/940] lr: 1.0000e-04 eta: 1:26:45 time: 0.5952 data_time: 0.0076 memory: 15293 grad_norm: 4.3606 loss: 0.4804 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4804
2023/05/27 17:26:06 - mmengine - INFO - Epoch(train) [41][660/940] lr: 1.0000e-04 eta: 1:26:33 time: 0.5932 data_time: 0.0074 memory: 15293 grad_norm: 4.7187 loss: 0.7638 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.7638
2023/05/27 17:26:17 - mmengine - INFO - Epoch(train) [41][680/940] lr: 1.0000e-04 eta: 1:26:21 time: 0.5906 data_time: 0.0078 memory: 15293 grad_norm: 4.3684 loss: 0.6098 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6098
2023/05/27 17:26:29 - mmengine - INFO - Epoch(train) [41][700/940] lr: 1.0000e-04 eta: 1:26:09 time: 0.5879 data_time: 0.0076 memory: 15293 grad_norm: 4.5292 loss: 0.4992 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4992
2023/05/27 17:26:41 - mmengine - INFO - Epoch(train) [41][720/940] lr: 1.0000e-04 eta: 1:25:58 time: 0.5960 data_time: 0.0075 memory: 15293 grad_norm: 4.4867 loss: 0.6107 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6107
2023/05/27 17:26:53 - mmengine - INFO - Epoch(train) [41][740/940] lr: 1.0000e-04 eta: 1:25:46 time: 0.5866 data_time: 0.0077 memory: 15293 grad_norm: 4.9562 loss: 0.5631 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5631
2023/05/27 17:27:05 - mmengine - INFO - Epoch(train) [41][760/940] lr: 1.0000e-04 eta: 1:25:34 time: 0.5903 data_time: 0.0077 memory: 15293 grad_norm: 5.3041 loss: 0.5372 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5372
2023/05/27 17:27:16 - mmengine - INFO - Epoch(train) [41][780/940] lr: 1.0000e-04 eta: 1:25:22 time: 0.5961 data_time: 0.0076 memory: 15293 grad_norm: 4.4445 loss: 0.4966 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4966
2023/05/27 17:27:28 - mmengine - INFO - Epoch(train) [41][800/940] lr: 1.0000e-04 eta: 1:25:10 time: 0.5984 data_time: 0.0077 memory: 15293 grad_norm: 4.5500 loss: 0.5030 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5030
2023/05/27 17:27:40 - mmengine - INFO - Epoch(train) [41][820/940] lr: 1.0000e-04 eta: 1:24:58 time: 0.5868 data_time: 0.0079 memory: 15293 grad_norm: 4.6245 loss: 0.4913 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4913
2023/05/27 17:27:52 - mmengine - INFO - Epoch(train) [41][840/940] lr: 1.0000e-04 eta: 1:24:46 time: 0.5897 data_time: 0.0075 memory: 15293 grad_norm: 4.9449 loss: 0.6003 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6003
2023/05/27 17:28:04 - mmengine - INFO - Epoch(train) [41][860/940] lr: 1.0000e-04 eta: 1:24:34 time: 0.5864 data_time: 0.0075 memory: 15293 grad_norm: 5.5485 loss: 0.6889 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6889
2023/05/27 17:28:16 - mmengine - INFO - Epoch(train) [41][880/940] lr: 1.0000e-04 eta: 1:24:22 time: 0.5933 data_time: 0.0073 memory: 15293 grad_norm: 4.7867 loss: 0.5639 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5639
2023/05/27 17:28:27 - mmengine - INFO - Epoch(train) [41][900/940] lr: 1.0000e-04 eta: 1:24:10 time: 0.5930 data_time: 0.0079 memory: 15293 grad_norm: 4.4838 loss: 0.7083 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.7083
2023/05/27 17:28:39 - mmengine - INFO - Epoch(train) [41][920/940] lr: 1.0000e-04 eta: 1:23:58 time: 0.5949 data_time: 0.0077 memory: 15293 grad_norm: 4.5012 loss: 0.5888 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5888
2023/05/27 17:28:51 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:28:51 - mmengine - INFO - Epoch(train) [41][940/940] lr: 1.0000e-04 eta: 1:23:46 time: 0.5730 data_time: 0.0086 memory: 15293 grad_norm: 5.2016 loss: 0.6436 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6436
2023/05/27 17:28:57 - mmengine - INFO - Epoch(val) [41][20/78] eta: 0:00:16 time: 0.2889 data_time: 0.1040 memory: 2851
2023/05/27 17:29:01 - mmengine - INFO - Epoch(val) [41][40/78] eta: 0:00:09 time: 0.2117 data_time: 0.0261 memory: 2851
2023/05/27 17:29:06 - mmengine - INFO - Epoch(val) [41][60/78] eta: 0:00:04 time: 0.2410 data_time: 0.0558 memory: 2851
2023/05/27 17:30:26 - mmengine - INFO - Epoch(val) [41][78/78] acc/top1: 0.7716 acc/top5: 0.9299 acc/mean1: 0.7715 data_time: 0.0481 time: 0.2302
2023/05/27 17:30:40 - mmengine - INFO - Epoch(train) [42][ 20/940] lr: 1.0000e-04 eta: 1:23:35 time: 0.6945 data_time: 0.0715 memory: 15293 grad_norm: 4.3130 loss: 0.6394 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6394
2023/05/27 17:30:52 - mmengine - INFO - Epoch(train) [42][ 40/940] lr: 1.0000e-04 eta: 1:23:23 time: 0.5876 data_time: 0.0075 memory: 15293 grad_norm: 5.5203 loss: 0.4311 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4311
2023/05/27 17:31:03 - mmengine - INFO - Epoch(train) [42][ 60/940] lr: 1.0000e-04 eta: 1:23:11 time: 0.5876 data_time: 0.0082 memory: 15293 grad_norm: 4.4889 loss: 0.6650 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6650
2023/05/27 17:31:15 - mmengine - INFO - Epoch(train) [42][ 80/940] lr: 1.0000e-04 eta: 1:23:00 time: 0.5857 data_time: 0.0081 memory: 15293 grad_norm: 4.8626 loss: 0.7473 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7473
2023/05/27 17:31:27 - mmengine - INFO - Epoch(train) [42][100/940] lr: 1.0000e-04 eta: 1:22:48 time: 0.5920 data_time: 0.0079 memory: 15293 grad_norm: 4.5668 loss: 0.4972 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4972
2023/05/27 17:31:39 - mmengine - INFO - Epoch(train) [42][120/940] lr: 1.0000e-04 eta: 1:22:36 time: 0.5849 data_time: 0.0078 memory: 15293 grad_norm: 4.4238 loss: 0.5447 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.5447
2023/05/27 17:31:50 - mmengine - INFO - Epoch(train) [42][140/940] lr: 1.0000e-04 eta: 1:22:24 time: 0.5952 data_time: 0.0077 memory: 15293 grad_norm: 4.5768 loss: 0.5408 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5408
2023/05/27 17:32:02 - mmengine - INFO - Epoch(train) [42][160/940] lr: 1.0000e-04 eta: 1:22:12 time: 0.5866 data_time: 0.0074 memory: 15293 grad_norm: 4.9193 loss: 0.6201 top1_acc: 0.2500 top5_acc: 0.7500 loss_cls: 0.6201
2023/05/27 17:32:14 - mmengine - INFO - Epoch(train) [42][180/940] lr: 1.0000e-04 eta: 1:22:00 time: 0.5884 data_time: 0.0075 memory: 15293 grad_norm: 4.7657 loss: 0.6958 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6958
2023/05/27 17:32:26 - mmengine - INFO - Epoch(train) [42][200/940] lr: 1.0000e-04 eta: 1:21:48 time: 0.5857 data_time: 0.0075 memory: 15293 grad_norm: 4.3490 loss: 0.5545 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5545
2023/05/27 17:32:38 - mmengine - INFO - Epoch(train) [42][220/940] lr: 1.0000e-04 eta: 1:21:36 time: 0.5939 data_time: 0.0074 memory: 15293 grad_norm: 4.9330 loss: 0.5720 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5720
2023/05/27 17:32:49 - mmengine - INFO - Epoch(train) [42][240/940] lr: 1.0000e-04 eta: 1:21:24 time: 0.5902 data_time: 0.0074 memory: 15293 grad_norm: 6.0428 loss: 0.7181 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7181
2023/05/27 17:33:01 - mmengine - INFO - Epoch(train) [42][260/940] lr: 1.0000e-04 eta: 1:21:12 time: 0.5899 data_time: 0.0076 memory: 15293 grad_norm: 5.9657 loss: 0.6527 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6527
2023/05/27 17:33:13 - mmengine - INFO - Epoch(train) [42][280/940] lr: 1.0000e-04 eta: 1:21:00 time: 0.5851 data_time: 0.0073 memory: 15293 grad_norm: 4.9541 loss: 0.4655 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4655
2023/05/27 17:33:25 - mmengine - INFO - Epoch(train) [42][300/940] lr: 1.0000e-04 eta: 1:20:48 time: 0.5899 data_time: 0.0076 memory: 15293 grad_norm: 4.5028 loss: 0.5057 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5057
2023/05/27 17:33:36 - mmengine - INFO - Epoch(train) [42][320/940] lr: 1.0000e-04 eta: 1:20:36 time: 0.5893 data_time: 0.0075 memory: 15293 grad_norm: 4.7966 loss: 0.6379 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6379
2023/05/27 17:33:48 - mmengine - INFO - Epoch(train) [42][340/940] lr: 1.0000e-04 eta: 1:20:25 time: 0.5994 data_time: 0.0077 memory: 15293 grad_norm: 5.6105 loss: 0.5819 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5819
2023/05/27 17:34:00 - mmengine - INFO - Epoch(train) [42][360/940] lr: 1.0000e-04 eta: 1:20:13 time: 0.5959 data_time: 0.0074 memory: 15293 grad_norm: 4.3789 loss: 0.5415 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5415
2023/05/27 17:34:12 - mmengine - INFO - Epoch(train) [42][380/940] lr: 1.0000e-04 eta: 1:20:01 time: 0.5861 data_time: 0.0075 memory: 15293 grad_norm: 4.5142 loss: 0.6184 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6184
2023/05/27 17:34:24 - mmengine - INFO - Epoch(train) [42][400/940] lr: 1.0000e-04 eta: 1:19:49 time: 0.5980 data_time: 0.0076 memory: 15293 grad_norm: 4.5271 loss: 0.4932 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4932
2023/05/27 17:34:36 - mmengine - INFO - Epoch(train) [42][420/940] lr: 1.0000e-04 eta: 1:19:37 time: 0.5877 data_time: 0.0076 memory: 15293 grad_norm: 4.4662 loss: 0.6860 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6860
2023/05/27 17:34:48 - mmengine - INFO - Epoch(train) [42][440/940] lr: 1.0000e-04 eta: 1:19:25 time: 0.5862 data_time: 0.0074 memory: 15293 grad_norm: 4.4108 loss: 0.6011 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6011
2023/05/27 17:34:59 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:34:59 - mmengine - INFO - Epoch(train) [42][460/940] lr: 1.0000e-04 eta: 1:19:13 time: 0.5901 data_time: 0.0077 memory: 15293 grad_norm: 4.9409 loss: 0.4986 top1_acc: 0.5000 top5_acc: 0.7500 loss_cls: 0.4986
2023/05/27 17:35:11 - mmengine - INFO - Epoch(train) [42][480/940] lr: 1.0000e-04 eta: 1:19:01 time: 0.5867 data_time: 0.0076 memory: 15293 grad_norm: 4.4979 loss: 0.6114 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6114
2023/05/27 17:35:23 - mmengine - INFO - Epoch(train) [42][500/940] lr: 1.0000e-04 eta: 1:18:49 time: 0.5922 data_time: 0.0075 memory: 15293 grad_norm: 4.5342 loss: 0.7431 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7431
2023/05/27 17:35:35 - mmengine - INFO - Epoch(train) [42][520/940] lr: 1.0000e-04 eta: 1:18:37 time: 0.5878 data_time: 0.0074 memory: 15293 grad_norm: 4.4548 loss: 0.4111 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4111
2023/05/27 17:35:47 - mmengine - INFO - Epoch(train) [42][540/940] lr: 1.0000e-04 eta: 1:18:25 time: 0.5926 data_time: 0.0076 memory: 15293 grad_norm: 4.3030 loss: 0.5191 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5191
2023/05/27 17:35:58 - mmengine - INFO - Epoch(train) [42][560/940] lr: 1.0000e-04 eta: 1:18:13 time: 0.5889 data_time: 0.0077 memory: 15293 grad_norm: 4.7944 loss: 0.7434 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7434
2023/05/27 17:36:10 - mmengine - INFO - Epoch(train) [42][580/940] lr: 1.0000e-04 eta: 1:18:02 time: 0.5860 data_time: 0.0076 memory: 15293 grad_norm: 4.4499 loss: 0.5726 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5726
2023/05/27 17:36:22 - mmengine - INFO - Epoch(train) [42][600/940] lr: 1.0000e-04 eta: 1:17:50 time: 0.5986 data_time: 0.0075 memory: 15293 grad_norm: 4.9068 loss: 0.4706 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4706
2023/05/27 17:36:34 - mmengine - INFO - Epoch(train) [42][620/940] lr: 1.0000e-04 eta: 1:17:38 time: 0.5876 data_time: 0.0076 memory: 15293 grad_norm: 5.0581 loss: 0.6323 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6323
2023/05/27 17:36:46 - mmengine - INFO - Epoch(train) [42][640/940] lr: 1.0000e-04 eta: 1:17:26 time: 0.5884 data_time: 0.0085 memory: 15293 grad_norm: 4.9040 loss: 0.5857 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5857
2023/05/27 17:36:57 - mmengine - INFO - Epoch(train) [42][660/940] lr: 1.0000e-04 eta: 1:17:14 time: 0.5926 data_time: 0.0077 memory: 15293 grad_norm: 4.6587 loss: 0.5421 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5421
2023/05/27 17:37:09 - mmengine - INFO - Epoch(train) [42][680/940] lr: 1.0000e-04 eta: 1:17:02 time: 0.5890 data_time: 0.0075 memory: 15293 grad_norm: 4.7610 loss: 0.4881 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4881
2023/05/27 17:37:21 - mmengine - INFO - Epoch(train) [42][700/940] lr: 1.0000e-04 eta: 1:16:50 time: 0.5863 data_time: 0.0075 memory: 15293 grad_norm: 4.3341 loss: 0.7944 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7944
2023/05/27 17:37:33 - mmengine - INFO - Epoch(train) [42][720/940] lr: 1.0000e-04 eta: 1:16:38 time: 0.5885 data_time: 0.0077 memory: 15293 grad_norm: 5.1617 loss: 0.6355 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6355
2023/05/27 17:37:44 - mmengine - INFO - Epoch(train) [42][740/940] lr: 1.0000e-04 eta: 1:16:26 time: 0.5865 data_time: 0.0076 memory: 15293 grad_norm: 4.5157 loss: 0.6273 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6273
2023/05/27 17:37:56 - mmengine - INFO - Epoch(train) [42][760/940] lr: 1.0000e-04 eta: 1:16:14 time: 0.5869 data_time: 0.0075 memory: 15293 grad_norm: 4.6380 loss: 0.7160 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7160
2023/05/27 17:38:08 - mmengine - INFO - Epoch(train) [42][780/940] lr: 1.0000e-04 eta: 1:16:02 time: 0.5964 data_time: 0.0078 memory: 15293 grad_norm: 4.4836 loss: 0.6547 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6547
2023/05/27 17:38:20 - mmengine - INFO - Epoch(train) [42][800/940] lr: 1.0000e-04 eta: 1:15:50 time: 0.5916 data_time: 0.0076 memory: 15293 grad_norm: 4.7164 loss: 0.4922 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4922
2023/05/27 17:38:32 - mmengine - INFO - Epoch(train) [42][820/940] lr: 1.0000e-04 eta: 1:15:39 time: 0.5908 data_time: 0.0078 memory: 15293 grad_norm: 4.7794 loss: 0.4912 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4912
2023/05/27 17:38:44 - mmengine - INFO - Epoch(train) [42][840/940] lr: 1.0000e-04 eta: 1:15:27 time: 0.5897 data_time: 0.0077 memory: 15293 grad_norm: 5.2570 loss: 0.5166 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5166
2023/05/27 17:38:55 - mmengine - INFO - Epoch(train) [42][860/940] lr: 1.0000e-04 eta: 1:15:15 time: 0.5867 data_time: 0.0077 memory: 15293 grad_norm: 4.4755 loss: 0.7414 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7414
2023/05/27 17:39:07 - mmengine - INFO - Epoch(train) [42][880/940] lr: 1.0000e-04 eta: 1:15:03 time: 0.5882 data_time: 0.0075 memory: 15293 grad_norm: 4.5594 loss: 0.7280 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7280
2023/05/27 17:39:19 - mmengine - INFO - Epoch(train) [42][900/940] lr: 1.0000e-04 eta: 1:14:51 time: 0.5899 data_time: 0.0077 memory: 15293 grad_norm: 4.4820 loss: 0.4671 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.4671
2023/05/27 17:39:31 - mmengine - INFO - Epoch(train) [42][920/940] lr: 1.0000e-04 eta: 1:14:39 time: 0.5867 data_time: 0.0076 memory: 15293 grad_norm: 5.4511 loss: 0.5549 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5549
2023/05/27 17:39:42 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:39:42 - mmengine - INFO - Epoch(train) [42][940/940] lr: 1.0000e-04 eta: 1:14:27 time: 0.5789 data_time: 0.0076 memory: 15293 grad_norm: 5.0545 loss: 0.5327 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5327
2023/05/27 17:39:42 - mmengine - INFO - Saving checkpoint at 42 epochs
2023/05/27 17:39:51 - mmengine - INFO - Epoch(val) [42][20/78] eta: 0:00:17 time: 0.3071 data_time: 0.1229 memory: 2851
2023/05/27 17:39:55 - mmengine - INFO - Epoch(val) [42][40/78] eta: 0:00:10 time: 0.2220 data_time: 0.0372 memory: 2851
2023/05/27 17:40:00 - mmengine - INFO - Epoch(val) [42][60/78] eta: 0:00:04 time: 0.2294 data_time: 0.0457 memory: 2851
2023/05/27 17:40:04 - mmengine - INFO - Epoch(val) [42][78/78] acc/top1: 0.7719 acc/top5: 0.9296 acc/mean1: 0.7717 data_time: 0.0530 time: 0.2342
2023/05/27 17:40:04 - mmengine - INFO - The previous best checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_36.pth is removed
2023/05/27 17:40:06 - mmengine - INFO - The best checkpoint with 0.7719 acc/top1 at 42 epoch is saved to best_acc_top1_epoch_42.pth.
2023/05/27 17:40:19 - mmengine - INFO - Epoch(train) [43][ 20/940] lr: 1.0000e-04 eta: 1:14:16 time: 0.6732 data_time: 0.0915 memory: 15293 grad_norm: 4.5057 loss: 0.7016 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7016
2023/05/27 17:40:31 - mmengine - INFO - Epoch(train) [43][ 40/940] lr: 1.0000e-04 eta: 1:14:04 time: 0.5983 data_time: 0.0076 memory: 15293 grad_norm: 4.4705 loss: 0.4796 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4796
2023/05/27 17:40:43 - mmengine - INFO - Epoch(train) [43][ 60/940] lr: 1.0000e-04 eta: 1:13:52 time: 0.5865 data_time: 0.0075 memory: 15293 grad_norm: 4.4214 loss: 0.5496 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.5496
2023/05/27 17:40:55 - mmengine - INFO - Epoch(train) [43][ 80/940] lr: 1.0000e-04 eta: 1:13:40 time: 0.5875 data_time: 0.0075 memory: 15293 grad_norm: 4.3328 loss: 0.7130 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7130
2023/05/27 17:41:06 - mmengine - INFO - Epoch(train) [43][100/940] lr: 1.0000e-04 eta: 1:13:28 time: 0.5903 data_time: 0.0075 memory: 15293 grad_norm: 4.6822 loss: 0.3747 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.3747
2023/05/27 17:41:18 - mmengine - INFO - Epoch(train) [43][120/940] lr: 1.0000e-04 eta: 1:13:16 time: 0.5913 data_time: 0.0074 memory: 15293 grad_norm: 5.5432 loss: 0.4844 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4844
2023/05/27 17:41:30 - mmengine - INFO - Epoch(train) [43][140/940] lr: 1.0000e-04 eta: 1:13:04 time: 0.5907 data_time: 0.0076 memory: 15293 grad_norm: 4.6051 loss: 0.6811 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6811
2023/05/27 17:41:42 - mmengine - INFO - Epoch(train) [43][160/940] lr: 1.0000e-04 eta: 1:12:52 time: 0.5890 data_time: 0.0075 memory: 15293 grad_norm: 4.7259 loss: 0.4364 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.4364
2023/05/27 17:41:54 - mmengine - INFO - Epoch(train) [43][180/940] lr: 1.0000e-04 eta: 1:12:40 time: 0.5871 data_time: 0.0075 memory: 15293 grad_norm: 4.6602 loss: 0.6076 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6076
2023/05/27 17:42:05 - mmengine - INFO - Epoch(train) [43][200/940] lr: 1.0000e-04 eta: 1:12:28 time: 0.5911 data_time: 0.0080 memory: 15293 grad_norm: 5.0090 loss: 0.6021 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6021
2023/05/27 17:42:17 - mmengine - INFO - Epoch(train) [43][220/940] lr: 1.0000e-04 eta: 1:12:16 time: 0.5885 data_time: 0.0079 memory: 15293 grad_norm: 4.5770 loss: 0.8473 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8473
2023/05/27 17:42:29 - mmengine - INFO - Epoch(train) [43][240/940] lr: 1.0000e-04 eta: 1:12:05 time: 0.5898 data_time: 0.0076 memory: 15293 grad_norm: 4.5337 loss: 0.3413 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.3413
2023/05/27 17:42:41 - mmengine - INFO - Epoch(train) [43][260/940] lr: 1.0000e-04 eta: 1:11:53 time: 0.5936 data_time: 0.0081 memory: 15293 grad_norm: 4.6658 loss: 0.5800 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5800
2023/05/27 17:42:53 - mmengine - INFO - Epoch(train) [43][280/940] lr: 1.0000e-04 eta: 1:11:41 time: 0.5863 data_time: 0.0076 memory: 15293 grad_norm: 4.9927 loss: 0.5076 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5076
2023/05/27 17:43:04 - mmengine - INFO - Epoch(train) [43][300/940] lr: 1.0000e-04 eta: 1:11:29 time: 0.5908 data_time: 0.0077 memory: 15293 grad_norm: 4.6700 loss: 0.4293 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4293
2023/05/27 17:43:16 - mmengine - INFO - Epoch(train) [43][320/940] lr: 1.0000e-04 eta: 1:11:17 time: 0.5912 data_time: 0.0077 memory: 15293 grad_norm: 5.4836 loss: 0.6978 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6978
2023/05/27 17:43:28 - mmengine - INFO - Epoch(train) [43][340/940] lr: 1.0000e-04 eta: 1:11:05 time: 0.5966 data_time: 0.0091 memory: 15293 grad_norm: 4.5941 loss: 0.6809 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6809
2023/05/27 17:43:40 - mmengine - INFO - Epoch(train) [43][360/940] lr: 1.0000e-04 eta: 1:10:53 time: 0.5922 data_time: 0.0076 memory: 15293 grad_norm: 4.3943 loss: 0.5688 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5688
2023/05/27 17:43:52 - mmengine - INFO - Epoch(train) [43][380/940] lr: 1.0000e-04 eta: 1:10:41 time: 0.5905 data_time: 0.0077 memory: 15293 grad_norm: 4.8794 loss: 0.5964 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5964
2023/05/27 17:44:04 - mmengine - INFO - Epoch(train) [43][400/940] lr: 1.0000e-04 eta: 1:10:29 time: 0.5880 data_time: 0.0076 memory: 15293 grad_norm: 5.0483 loss: 0.5460 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5460
2023/05/27 17:44:15 - mmengine - INFO - Epoch(train) [43][420/940] lr: 1.0000e-04 eta: 1:10:17 time: 0.5898 data_time: 0.0077 memory: 15293 grad_norm: 4.7897 loss: 0.5469 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5469
2023/05/27 17:44:27 - mmengine - INFO - Epoch(train) [43][440/940] lr: 1.0000e-04 eta: 1:10:06 time: 0.5895 data_time: 0.0077 memory: 15293 grad_norm: 4.6272 loss: 0.5839 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5839
2023/05/27 17:44:39 - mmengine - INFO - Epoch(train) [43][460/940] lr: 1.0000e-04 eta: 1:09:54 time: 0.5874 data_time: 0.0078 memory: 15293 grad_norm: 5.1514 loss: 0.6919 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6919
2023/05/27 17:44:51 - mmengine - INFO - Epoch(train) [43][480/940] lr: 1.0000e-04 eta: 1:09:42 time: 0.5872 data_time: 0.0078 memory: 15293 grad_norm: 4.5045 loss: 0.5803 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5803
2023/05/27 17:45:03 - mmengine - INFO - Epoch(train) [43][500/940] lr: 1.0000e-04 eta: 1:09:30 time: 0.5920 data_time: 0.0077 memory: 15293 grad_norm: 4.5952 loss: 0.5312 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5312
2023/05/27 17:45:14 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:45:14 - mmengine - INFO - Epoch(train) [43][520/940] lr: 1.0000e-04 eta: 1:09:18 time: 0.5907 data_time: 0.0077 memory: 15293 grad_norm: 4.8173 loss: 0.5818 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5818
2023/05/27 17:45:26 - mmengine - INFO - Epoch(train) [43][540/940] lr: 1.0000e-04 eta: 1:09:06 time: 0.5899 data_time: 0.0077 memory: 15293 grad_norm: 4.5126 loss: 0.6891 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.6891
2023/05/27 17:45:38 - mmengine - INFO - Epoch(train) [43][560/940] lr: 1.0000e-04 eta: 1:08:54 time: 0.5935 data_time: 0.0078 memory: 15293 grad_norm: 5.2641 loss: 0.5824 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5824
2023/05/27 17:45:50 - mmengine - INFO - Epoch(train) [43][580/940] lr: 1.0000e-04 eta: 1:08:42 time: 0.5895 data_time: 0.0075 memory: 15293 grad_norm: 4.7800 loss: 0.5846 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5846
2023/05/27 17:46:02 - mmengine - INFO - Epoch(train) [43][600/940] lr: 1.0000e-04 eta: 1:08:30 time: 0.5864 data_time: 0.0075 memory: 15293 grad_norm: 4.7478 loss: 0.6530 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6530
2023/05/27 17:46:13 - mmengine - INFO - Epoch(train) [43][620/940] lr: 1.0000e-04 eta: 1:08:18 time: 0.5899 data_time: 0.0078 memory: 15293 grad_norm: 5.9531 loss: 0.5672 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5672
2023/05/27 17:46:25 - mmengine - INFO - Epoch(train) [43][640/940] lr: 1.0000e-04 eta: 1:08:06 time: 0.5867 data_time: 0.0075 memory: 15293 grad_norm: 5.7151 loss: 0.6693 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6693
2023/05/27 17:46:37 - mmengine - INFO - Epoch(train) [43][660/940] lr: 1.0000e-04 eta: 1:07:55 time: 0.5936 data_time: 0.0077 memory: 15293 grad_norm: 4.6160 loss: 0.5416 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5416
2023/05/27 17:46:49 - mmengine - INFO - Epoch(train) [43][680/940] lr: 1.0000e-04 eta: 1:07:43 time: 0.5935 data_time: 0.0076 memory: 15293 grad_norm: 4.6003 loss: 0.5899 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5899
2023/05/27 17:47:01 - mmengine - INFO - Epoch(train) [43][700/940] lr: 1.0000e-04 eta: 1:07:31 time: 0.5867 data_time: 0.0076 memory: 15293 grad_norm: 4.7571 loss: 0.7548 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7548
2023/05/27 17:47:12 - mmengine - INFO - Epoch(train) [43][720/940] lr: 1.0000e-04 eta: 1:07:19 time: 0.5912 data_time: 0.0075 memory: 15293 grad_norm: 4.8734 loss: 0.6218 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6218
2023/05/27 17:47:24 - mmengine - INFO - Epoch(train) [43][740/940] lr: 1.0000e-04 eta: 1:07:07 time: 0.5937 data_time: 0.0078 memory: 15293 grad_norm: 4.7399 loss: 0.4471 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4471
2023/05/27 17:47:36 - mmengine - INFO - Epoch(train) [43][760/940] lr: 1.0000e-04 eta: 1:06:55 time: 0.5920 data_time: 0.0078 memory: 15293 grad_norm: 4.5658 loss: 0.5009 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5009
2023/05/27 17:47:48 - mmengine - INFO - Epoch(train) [43][780/940] lr: 1.0000e-04 eta: 1:06:43 time: 0.5867 data_time: 0.0079 memory: 15293 grad_norm: 4.3401 loss: 0.4881 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4881
2023/05/27 17:48:00 - mmengine - INFO - Epoch(train) [43][800/940] lr: 1.0000e-04 eta: 1:06:31 time: 0.5907 data_time: 0.0080 memory: 15293 grad_norm: 4.4756 loss: 0.6158 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6158
2023/05/27 17:48:11 - mmengine - INFO - Epoch(train) [43][820/940] lr: 1.0000e-04 eta: 1:06:19 time: 0.5922 data_time: 0.0077 memory: 15293 grad_norm: 6.0114 loss: 0.5848 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5848
2023/05/27 17:48:23 - mmengine - INFO - Epoch(train) [43][840/940] lr: 1.0000e-04 eta: 1:06:07 time: 0.5891 data_time: 0.0075 memory: 15293 grad_norm: 5.6357 loss: 0.6121 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6121
2023/05/27 17:48:35 - mmengine - INFO - Epoch(train) [43][860/940] lr: 1.0000e-04 eta: 1:05:55 time: 0.5865 data_time: 0.0077 memory: 15293 grad_norm: 4.7986 loss: 0.6139 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6139
2023/05/27
17:48:47 - mmengine - INFO - Epoch(train) [43][880/940] lr: 1.0000e-04 eta: 1:05:44 time: 0.5897 data_time: 0.0077 memory: 15293 grad_norm: 4.8673 loss: 0.6176 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6176 2023/05/27 17:48:59 - mmengine - INFO - Epoch(train) [43][900/940] lr: 1.0000e-04 eta: 1:05:32 time: 0.5919 data_time: 0.0077 memory: 15293 grad_norm: 5.5088 loss: 0.6079 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6079 2023/05/27 17:49:10 - mmengine - INFO - Epoch(train) [43][920/940] lr: 1.0000e-04 eta: 1:05:20 time: 0.5891 data_time: 0.0075 memory: 15293 grad_norm: 5.2554 loss: 0.7444 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7444 2023/05/27 17:49:22 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 17:49:22 - mmengine - INFO - Epoch(train) [43][940/940] lr: 1.0000e-04 eta: 1:05:08 time: 0.5709 data_time: 0.0076 memory: 15293 grad_norm: 4.6475 loss: 0.5267 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.5267 2023/05/27 17:49:28 - mmengine - INFO - Epoch(val) [43][20/78] eta: 0:00:16 time: 0.2879 data_time: 0.1030 memory: 2851 2023/05/27 17:49:32 - mmengine - INFO - Epoch(val) [43][40/78] eta: 0:00:09 time: 0.2237 data_time: 0.0385 memory: 2851 2023/05/27 17:49:37 - mmengine - INFO - Epoch(val) [43][60/78] eta: 0:00:04 time: 0.2426 data_time: 0.0581 memory: 2851 2023/05/27 17:49:42 - mmengine - INFO - Epoch(val) [43][78/78] acc/top1: 0.7723 acc/top5: 0.9296 acc/mean1: 0.7722 data_time: 0.0516 time: 0.2334 2023/05/27 17:49:42 - mmengine - INFO - The previous best checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_42.pth is removed 2023/05/27 17:49:43 - mmengine - INFO - The best checkpoint with 0.7723 acc/top1 at 43 epoch is saved to best_acc_top1_epoch_43.pth. 
2023/05/27 17:49:57 - mmengine - INFO - Epoch(train) [44][ 20/940] lr: 1.0000e-04 eta: 1:04:56 time: 0.6624 data_time: 0.0786 memory: 15293 grad_norm: 4.9732 loss: 0.5223 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5223
2023/05/27 17:50:08 - mmengine - INFO - Epoch(train) [44][ 40/940] lr: 1.0000e-04 eta: 1:04:44 time: 0.5915 data_time: 0.0087 memory: 15293 grad_norm: 4.7022 loss: 0.5769 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5769
2023/05/27 17:50:20 - mmengine - INFO - Epoch(train) [44][ 60/940] lr: 1.0000e-04 eta: 1:04:32 time: 0.5883 data_time: 0.0078 memory: 15293 grad_norm: 4.6082 loss: 0.5896 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5896
2023/05/27 17:50:32 - mmengine - INFO - Epoch(train) [44][ 80/940] lr: 1.0000e-04 eta: 1:04:21 time: 0.5935 data_time: 0.0079 memory: 15293 grad_norm: 4.5449 loss: 0.5642 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5642
2023/05/27 17:50:44 - mmengine - INFO - Epoch(train) [44][100/940] lr: 1.0000e-04 eta: 1:04:09 time: 0.5872 data_time: 0.0078 memory: 15293 grad_norm: 4.5837 loss: 0.5875 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5875
2023/05/27 17:50:56 - mmengine - INFO - Epoch(train) [44][120/940] lr: 1.0000e-04 eta: 1:03:57 time: 0.5915 data_time: 0.0075 memory: 15293 grad_norm: 4.6861 loss: 0.4429 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4429
2023/05/27 17:51:08 - mmengine - INFO - Epoch(train) [44][140/940] lr: 1.0000e-04 eta: 1:03:45 time: 0.5934 data_time: 0.0076 memory: 15293 grad_norm: 4.6311 loss: 0.5130 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5130
2023/05/27 17:51:19 - mmengine - INFO - Epoch(train) [44][160/940] lr: 1.0000e-04 eta: 1:03:33 time: 0.5927 data_time: 0.0077 memory: 15293 grad_norm: 4.5246 loss: 0.4705 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4705
2023/05/27 17:51:31 - mmengine - INFO - Epoch(train) [44][180/940] lr: 1.0000e-04 eta: 1:03:21 time: 0.5918 data_time: 0.0078 memory: 15293 grad_norm: 4.8640 loss: 0.8092 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.8092
2023/05/27 17:51:43 - mmengine - INFO - Epoch(train) [44][200/940] lr: 1.0000e-04 eta: 1:03:09 time: 0.5907 data_time: 0.0076 memory: 15293 grad_norm: 4.8776 loss: 0.4611 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4611
2023/05/27 17:51:55 - mmengine - INFO - Epoch(train) [44][220/940] lr: 1.0000e-04 eta: 1:02:57 time: 0.5862 data_time: 0.0079 memory: 15293 grad_norm: 5.2132 loss: 0.6136 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6136
2023/05/27 17:52:07 - mmengine - INFO - Epoch(train) [44][240/940] lr: 1.0000e-04 eta: 1:02:45 time: 0.5959 data_time: 0.0077 memory: 15293 grad_norm: 4.5967 loss: 0.5981 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5981
2023/05/27 17:52:19 - mmengine - INFO - Epoch(train) [44][260/940] lr: 1.0000e-04 eta: 1:02:34 time: 0.5956 data_time: 0.0087 memory: 15293 grad_norm: 4.7491 loss: 0.7583 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7583
2023/05/27 17:52:31 - mmengine - INFO - Epoch(train) [44][280/940] lr: 1.0000e-04 eta: 1:02:22 time: 0.5988 data_time: 0.0081 memory: 15293 grad_norm: 4.7401 loss: 0.5722 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5722
2023/05/27 17:52:42 - mmengine - INFO - Epoch(train) [44][300/940] lr: 1.0000e-04 eta: 1:02:10 time: 0.5884 data_time: 0.0077 memory: 15293 grad_norm: 4.8995 loss: 0.7795 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7795
2023/05/27 17:52:54 - mmengine - INFO - Epoch(train) [44][320/940] lr: 1.0000e-04 eta: 1:01:58 time: 0.5858 data_time: 0.0076 memory: 15293 grad_norm: 4.7855 loss: 0.6521 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6521
2023/05/27 17:53:06 - mmengine - INFO - Epoch(train) [44][340/940] lr: 1.0000e-04 eta: 1:01:46 time: 0.5988 data_time: 0.0075 memory: 15293 grad_norm: 4.4908 loss: 0.5655 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5655
2023/05/27 17:53:18 - mmengine - INFO - Epoch(train) [44][360/940] lr: 1.0000e-04 eta: 1:01:34 time: 0.5896 data_time: 0.0077 memory: 15293 grad_norm: 4.7576 loss: 0.3728 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.3728
2023/05/27 17:53:30 - mmengine - INFO - Epoch(train) [44][380/940] lr: 1.0000e-04 eta: 1:01:22 time: 0.5868 data_time: 0.0079 memory: 15293 grad_norm: 4.4253 loss: 0.6028 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6028
2023/05/27 17:53:42 - mmengine - INFO - Epoch(train) [44][400/940] lr: 1.0000e-04 eta: 1:01:10 time: 0.5971 data_time: 0.0074 memory: 15293 grad_norm: 4.5132 loss: 0.4934 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4934
2023/05/27 17:53:53 - mmengine - INFO - Epoch(train) [44][420/940] lr: 1.0000e-04 eta: 1:00:58 time: 0.5955 data_time: 0.0076 memory: 15293 grad_norm: 4.5533 loss: 0.6047 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6047
2023/05/27 17:54:05 - mmengine - INFO - Epoch(train) [44][440/940] lr: 1.0000e-04 eta: 1:00:47 time: 0.5943 data_time: 0.0075 memory: 15293 grad_norm: 4.5096 loss: 0.6811 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6811
2023/05/27 17:54:17 - mmengine - INFO - Epoch(train) [44][460/940] lr: 1.0000e-04 eta: 1:00:35 time: 0.5941 data_time: 0.0076 memory: 15293 grad_norm: 4.9040 loss: 0.8212 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8212
2023/05/27 17:54:29 - mmengine - INFO - Epoch(train) [44][480/940] lr: 1.0000e-04 eta: 1:00:23 time: 0.5972 data_time: 0.0077 memory: 15293 grad_norm: 4.5410 loss: 0.5591 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5591
2023/05/27 17:54:41 - mmengine - INFO - Epoch(train) [44][500/940] lr: 1.0000e-04 eta: 1:00:11 time: 0.5931 data_time: 0.0078 memory: 15293 grad_norm: 4.5605 loss: 0.6125 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6125
2023/05/27 17:54:53 - mmengine - INFO - Epoch(train) [44][520/940] lr: 1.0000e-04 eta: 0:59:59 time: 0.5909 data_time: 0.0074 memory: 15293 grad_norm: 5.1262 loss: 0.6262 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6262
2023/05/27 17:55:05 - mmengine - INFO - Epoch(train) [44][540/940] lr: 1.0000e-04 eta: 0:59:47 time: 0.5958 data_time: 0.0079 memory: 15293 grad_norm: 4.3980 loss: 0.4904 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4904
2023/05/27 17:55:16 - mmengine - INFO - Epoch(train) [44][560/940] lr: 1.0000e-04 eta: 0:59:35 time: 0.5865 data_time: 0.0078 memory: 15293 grad_norm: 4.6601 loss: 0.5714 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5714
2023/05/27 17:55:28 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:55:28 - mmengine - INFO - Epoch(train) [44][580/940] lr: 1.0000e-04 eta: 0:59:23 time: 0.5922 data_time: 0.0077 memory: 15293 grad_norm: 5.0393 loss: 0.6556 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6556
2023/05/27 17:55:40 - mmengine - INFO - Epoch(train) [44][600/940] lr: 1.0000e-04 eta: 0:59:11 time: 0.5907 data_time: 0.0077 memory: 15293 grad_norm: 4.7461 loss: 0.7168 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7168
2023/05/27 17:55:52 - mmengine - INFO - Epoch(train) [44][620/940] lr: 1.0000e-04 eta: 0:59:00 time: 0.5929 data_time: 0.0079 memory: 15293 grad_norm: 4.5526 loss: 0.7016 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7016
2023/05/27 17:56:04 - mmengine - INFO - Epoch(train) [44][640/940] lr: 1.0000e-04 eta: 0:58:48 time: 0.5933 data_time: 0.0075 memory: 15293 grad_norm: 5.1027 loss: 0.6184 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6184
2023/05/27 17:56:16 - mmengine - INFO - Epoch(train) [44][660/940] lr: 1.0000e-04 eta: 0:58:36 time: 0.5976 data_time: 0.0081 memory: 15293 grad_norm: 4.5443 loss: 0.4655 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4655
2023/05/27 17:56:28 - mmengine - INFO - Epoch(train) [44][680/940] lr: 1.0000e-04 eta: 0:58:24 time: 0.5911 data_time: 0.0074 memory: 15293 grad_norm: 4.5769 loss: 0.3641 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.3641
2023/05/27 17:56:39 - mmengine - INFO - Epoch(train) [44][700/940] lr: 1.0000e-04 eta: 0:58:12 time: 0.5873 data_time: 0.0090 memory: 15293 grad_norm: 4.5938 loss: 0.5057 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5057
2023/05/27 17:56:51 - mmengine - INFO - Epoch(train) [44][720/940] lr: 1.0000e-04 eta: 0:58:00 time: 0.5860 data_time: 0.0077 memory: 15293 grad_norm: 4.6161 loss: 0.4919 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4919
2023/05/27 17:57:03 - mmengine - INFO - Epoch(train) [44][740/940] lr: 1.0000e-04 eta: 0:57:48 time: 0.5931 data_time: 0.0077 memory: 15293 grad_norm: 4.5567 loss: 0.7088 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7088
2023/05/27 17:57:15 - mmengine - INFO - Epoch(train) [44][760/940] lr: 1.0000e-04 eta: 0:57:36 time: 0.5867 data_time: 0.0075 memory: 15293 grad_norm: 4.6095 loss: 0.4770 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.4770
2023/05/27 17:57:26 - mmengine - INFO - Epoch(train) [44][780/940] lr: 1.0000e-04 eta: 0:57:24 time: 0.5883 data_time: 0.0075 memory: 15293 grad_norm: 4.5651 loss: 0.4052 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4052
2023/05/27 17:57:38 - mmengine - INFO - Epoch(train) [44][800/940] lr: 1.0000e-04 eta: 0:57:12 time: 0.5929 data_time: 0.0075 memory: 15293 grad_norm: 5.2889 loss: 0.7337 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.7337
2023/05/27 17:57:50 - mmengine - INFO - Epoch(train) [44][820/940] lr: 1.0000e-04 eta: 0:57:01 time: 0.5891 data_time: 0.0076 memory: 15293 grad_norm: 4.7280 loss: 0.5446 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5446
2023/05/27 17:58:02 - mmengine - INFO - Epoch(train) [44][840/940] lr: 1.0000e-04 eta: 0:56:49 time: 0.5971 data_time: 0.0075 memory: 15293 grad_norm: 4.4941 loss: 0.6504 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6504
2023/05/27 17:58:14 - mmengine - INFO - Epoch(train) [44][860/940] lr: 1.0000e-04 eta: 0:56:37 time: 0.6043 data_time: 0.0076 memory: 15293 grad_norm: 5.0726 loss: 0.5339 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5339
2023/05/27 17:58:26 - mmengine - INFO - Epoch(train) [44][880/940] lr: 1.0000e-04 eta: 0:56:25 time: 0.5858 data_time: 0.0074 memory: 15293 grad_norm: 5.0127 loss: 0.5341 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5341
2023/05/27 17:58:38 - mmengine - INFO - Epoch(train) [44][900/940] lr: 1.0000e-04 eta: 0:56:13 time: 0.5853 data_time: 0.0078 memory: 15293 grad_norm: 4.6838 loss: 0.5509 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5509
2023/05/27 17:58:49 - mmengine - INFO - Epoch(train) [44][920/940] lr: 1.0000e-04 eta: 0:56:01 time: 0.5961 data_time: 0.0076 memory: 15293 grad_norm: 4.5537 loss: 0.5424 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5424
2023/05/27 17:59:01 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 17:59:01 - mmengine - INFO - Epoch(train) [44][940/940] lr: 1.0000e-04 eta: 0:55:49 time: 0.5752 data_time: 0.0071 memory: 15293 grad_norm: 4.9665 loss: 0.6363 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6363
2023/05/27 17:59:07 - mmengine - INFO - Epoch(val) [44][20/78] eta: 0:00:17 time: 0.3031 data_time: 0.1180 memory: 2851
2023/05/27 17:59:12 - mmengine - INFO - Epoch(val) [44][40/78] eta: 0:00:10 time: 0.2255 data_time: 0.0404 memory: 2851
2023/05/27 17:59:16 - mmengine - INFO - Epoch(val) [44][60/78] eta: 0:00:04 time: 0.2336 data_time: 0.0486 memory: 2851
2023/05/27 18:00:25 - mmengine - INFO - Epoch(val) [44][78/78] acc/top1: 0.7714 acc/top5: 0.9299 acc/mean1: 0.7713 data_time: 0.0534 time: 0.2353
2023/05/27 18:00:39 - mmengine - INFO - Epoch(train) [45][ 20/940] lr: 1.0000e-04 eta: 0:55:38 time: 0.6898 data_time: 0.1093 memory: 15293 grad_norm: 4.8276 loss: 0.5996 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5996
2023/05/27 18:00:51 - mmengine - INFO - Epoch(train) [45][ 40/940] lr: 1.0000e-04 eta: 0:55:26 time: 0.5947 data_time: 0.0075 memory: 15293 grad_norm: 5.0509 loss: 0.6171 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6171
2023/05/27 18:01:03 - mmengine - INFO - Epoch(train) [45][ 60/940] lr: 1.0000e-04 eta: 0:55:14 time: 0.5875 data_time: 0.0077 memory: 15293 grad_norm: 5.0038 loss: 0.5370 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5370
2023/05/27 18:01:15 - mmengine - INFO - Epoch(train) [45][ 80/940] lr: 1.0000e-04 eta: 0:55:02 time: 0.5932 data_time: 0.0077 memory: 15293 grad_norm: 5.0674 loss: 0.4965 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4965
2023/05/27 18:01:27 - mmengine - INFO - Epoch(train) [45][100/940] lr: 1.0000e-04 eta: 0:54:50 time: 0.5899 data_time: 0.0077 memory: 15293 grad_norm: 4.5160 loss: 0.4245 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4245
2023/05/27 18:01:38 - mmengine - INFO - Epoch(train) [45][120/940] lr: 1.0000e-04 eta: 0:54:38 time: 0.5868 data_time: 0.0077 memory: 15293 grad_norm: 4.6200 loss: 0.6501 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6501
2023/05/27 18:01:50 - mmengine - INFO - Epoch(train) [45][140/940] lr: 1.0000e-04 eta: 0:54:26 time: 0.5981 data_time: 0.0078 memory: 15293 grad_norm: 4.8684 loss: 0.6094 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6094
2023/05/27 18:02:02 - mmengine - INFO - Epoch(train) [45][160/940] lr: 1.0000e-04 eta: 0:54:15 time: 0.5909 data_time: 0.0076 memory: 15293 grad_norm: 4.9035 loss: 0.4124 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4124
2023/05/27 18:02:14 - mmengine - INFO - Epoch(train) [45][180/940] lr: 1.0000e-04 eta: 0:54:03 time: 0.5860 data_time: 0.0076 memory: 15293 grad_norm: 4.6236 loss: 0.6746 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.6746
2023/05/27 18:02:26 - mmengine - INFO - Epoch(train) [45][200/940] lr: 1.0000e-04 eta: 0:53:51 time: 0.5919 data_time: 0.0075 memory: 15293 grad_norm: 4.8177 loss: 0.7318 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7318
2023/05/27 18:02:38 - mmengine - INFO - Epoch(train) [45][220/940] lr: 1.0000e-04 eta: 0:53:39 time: 0.6010 data_time: 0.0076 memory: 15293 grad_norm: 4.5217 loss: 0.6306 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.6306
2023/05/27 18:02:50 - mmengine - INFO - Epoch(train) [45][240/940] lr: 1.0000e-04 eta: 0:53:27 time: 0.5974 data_time: 0.0077 memory: 15293 grad_norm: 4.3622 loss: 0.4716 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.4716
2023/05/27 18:03:02 - mmengine - INFO - Epoch(train) [45][260/940] lr: 1.0000e-04 eta: 0:53:15 time: 0.5971 data_time: 0.0077 memory: 15293 grad_norm: 4.8816 loss: 0.6102 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6102
2023/05/27 18:03:14 - mmengine - INFO - Epoch(train) [45][280/940] lr: 1.0000e-04 eta: 0:53:03 time: 0.6005 data_time: 0.0077 memory: 15293 grad_norm: 5.3985 loss: 0.5837 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5837
2023/05/27 18:03:25 - mmengine - INFO - Epoch(train) [45][300/940] lr: 1.0000e-04 eta: 0:52:51 time: 0.5955 data_time: 0.0079 memory: 15293 grad_norm: 4.6758 loss: 0.4921 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4921
2023/05/27 18:03:37 - mmengine - INFO - Epoch(train) [45][320/940] lr: 1.0000e-04 eta: 0:52:40 time: 0.5904 data_time: 0.0081 memory: 15293 grad_norm: 4.5451 loss: 0.4755 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4755
2023/05/27 18:03:49 - mmengine - INFO - Epoch(train) [45][340/940] lr: 1.0000e-04 eta: 0:52:28 time: 0.5918 data_time: 0.0079 memory: 15293 grad_norm: 4.9605 loss: 0.5344 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5344
2023/05/27 18:04:01 - mmengine - INFO - Epoch(train) [45][360/940] lr: 1.0000e-04 eta: 0:52:16 time: 0.5865 data_time: 0.0078 memory: 15293 grad_norm: 5.0707 loss: 0.5164 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5164
2023/05/27 18:04:13 - mmengine - INFO - Epoch(train) [45][380/940] lr: 1.0000e-04 eta: 0:52:04 time: 0.5905 data_time: 0.0076 memory: 15293 grad_norm: 5.5126 loss: 0.4894 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4894
2023/05/27 18:04:24 - mmengine - INFO - Epoch(train) [45][400/940] lr: 1.0000e-04 eta: 0:51:52 time: 0.5860 data_time: 0.0079 memory: 15293 grad_norm: 4.7307 loss: 0.4789 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4789
2023/05/27 18:04:36 - mmengine - INFO - Epoch(train) [45][420/940] lr: 1.0000e-04 eta: 0:51:40 time: 0.5925 data_time: 0.0075 memory: 15293 grad_norm: 5.2681 loss: 0.6034 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6034
2023/05/27 18:04:48 - mmengine - INFO - Epoch(train) [45][440/940] lr: 1.0000e-04 eta: 0:51:28 time: 0.5906 data_time: 0.0078 memory: 15293 grad_norm: 4.6309 loss: 0.8013 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.8013
2023/05/27 18:05:00 - mmengine - INFO - Epoch(train) [45][460/940] lr: 1.0000e-04 eta: 0:51:16 time: 0.5922 data_time: 0.0078 memory: 15293 grad_norm: 4.8527 loss: 0.5102 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5102
2023/05/27 18:05:12 - mmengine - INFO - Epoch(train) [45][480/940] lr: 1.0000e-04 eta: 0:51:04 time: 0.5944 data_time: 0.0076 memory: 15293 grad_norm: 5.1730 loss: 0.4711 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4711
2023/05/27 18:05:24 - mmengine - INFO - Epoch(train) [45][500/940] lr: 1.0000e-04 eta: 0:50:53 time: 0.5968 data_time: 0.0075 memory: 15293 grad_norm: 4.6543 loss: 0.6767 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6767
2023/05/27 18:05:35 - mmengine - INFO - Epoch(train) [45][520/940] lr: 1.0000e-04 eta: 0:50:41 time: 0.5869 data_time: 0.0076 memory: 15293 grad_norm: 4.6365 loss: 0.3646 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.3646
2023/05/27 18:05:47 - mmengine - INFO - Epoch(train) [45][540/940] lr: 1.0000e-04 eta: 0:50:29 time: 0.5953 data_time: 0.0079 memory: 15293 grad_norm: 4.8846 loss: 0.5483 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5483
2023/05/27 18:05:59 - mmengine - INFO - Epoch(train) [45][560/940] lr: 1.0000e-04 eta: 0:50:17 time: 0.5881 data_time: 0.0077 memory: 15293 grad_norm: 5.1612 loss: 0.5776 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5776
2023/05/27 18:06:11 - mmengine - INFO - Epoch(train) [45][580/940] lr: 1.0000e-04 eta: 0:50:05 time: 0.5950 data_time: 0.0080 memory: 15293 grad_norm: 4.7815 loss: 0.5754 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5754
2023/05/27 18:06:23 - mmengine - INFO - Epoch(train) [45][600/940] lr: 1.0000e-04 eta: 0:49:53 time: 0.5867 data_time: 0.0078 memory: 15293 grad_norm: 4.9909 loss: 0.6124 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6124
2023/05/27 18:06:35 - mmengine - INFO - Epoch(train) [45][620/940] lr: 1.0000e-04 eta: 0:49:41 time: 0.5890 data_time: 0.0078 memory: 15293 grad_norm: 4.6327 loss: 0.5597 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5597
2023/05/27 18:06:46 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 18:06:46 - mmengine - INFO - Epoch(train) [45][640/940] lr: 1.0000e-04 eta: 0:49:29 time: 0.5874 data_time: 0.0079 memory: 15293 grad_norm: 4.3206 loss: 0.4563 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4563
2023/05/27 18:06:58 - mmengine - INFO - Epoch(train) [45][660/940] lr: 1.0000e-04 eta: 0:49:17 time: 0.5911 data_time: 0.0080 memory: 15293 grad_norm: 5.1747 loss: 0.6711 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6711
2023/05/27 18:07:10 - mmengine - INFO - Epoch(train) [45][680/940] lr: 1.0000e-04 eta: 0:49:05 time: 0.5874 data_time: 0.0076 memory: 15293 grad_norm: 4.6082 loss: 0.5717 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5717
2023/05/27 18:07:22 - mmengine - INFO - Epoch(train) [45][700/940] lr: 1.0000e-04 eta: 0:48:54 time: 0.5856 data_time: 0.0077 memory: 15293 grad_norm: 5.3722 loss: 0.7275 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7275
2023/05/27 18:07:33 - mmengine - INFO - Epoch(train) [45][720/940] lr: 1.0000e-04 eta: 0:48:42 time: 0.5949 data_time: 0.0083 memory: 15293 grad_norm: 5.2337 loss: 0.5603 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5603
2023/05/27 18:07:45 - mmengine - INFO - Epoch(train) [45][740/940] lr: 1.0000e-04 eta: 0:48:30 time: 0.5954 data_time: 0.0076 memory: 15293 grad_norm: 4.5941 loss: 0.6070 top1_acc: 0.5000 top5_acc: 0.5000 loss_cls: 0.6070
2023/05/27 18:07:57 - mmengine - INFO - Epoch(train) [45][760/940] lr: 1.0000e-04 eta: 0:48:18 time: 0.5878 data_time: 0.0077 memory: 15293 grad_norm: 4.7281 loss: 0.6717 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6717
2023/05/27 18:08:09 - mmengine - INFO - Epoch(train) [45][780/940] lr: 1.0000e-04 eta: 0:48:06 time: 0.5929 data_time: 0.0080 memory: 15293 grad_norm: 4.4588 loss: 0.5131 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5131
2023/05/27 18:08:21 - mmengine - INFO - Epoch(train) [45][800/940] lr: 1.0000e-04 eta: 0:47:54 time: 0.5884 data_time: 0.0079 memory: 15293 grad_norm: 4.4605 loss: 0.6329 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6329
2023/05/27 18:08:32 - mmengine - INFO - Epoch(train) [45][820/940] lr: 1.0000e-04 eta: 0:47:42 time: 0.5863 data_time: 0.0079 memory: 15293 grad_norm: 4.3161 loss: 0.6370 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6370
2023/05/27 18:08:44 - mmengine - INFO - Epoch(train) [45][840/940] lr: 1.0000e-04 eta: 0:47:30 time: 0.5957 data_time: 0.0080 memory: 15293 grad_norm: 5.1286 loss: 0.4748 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.4748
2023/05/27 18:08:56 - mmengine - INFO - Epoch(train) [45][860/940] lr: 1.0000e-04 eta: 0:47:18 time: 0.5874 data_time: 0.0079 memory: 15293 grad_norm: 4.6563 loss: 0.6413 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6413
2023/05/27 18:09:08 - mmengine - INFO - Epoch(train) [45][880/940] lr: 1.0000e-04 eta: 0:47:07 time: 0.5866 data_time: 0.0077 memory: 15293 grad_norm: 4.4159 loss: 0.5163 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5163
2023/05/27 18:09:20 - mmengine - INFO - Epoch(train) [45][900/940] lr: 1.0000e-04 eta: 0:46:55 time: 0.5976 data_time: 0.0078 memory: 15293 grad_norm: 4.7427 loss: 0.4821 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4821
2023/05/27 18:09:32 - mmengine - INFO - Epoch(train) [45][920/940] lr: 1.0000e-04 eta: 0:46:43 time: 0.5898 data_time: 0.0075 memory: 15293 grad_norm: 4.2982 loss: 0.6149 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6149
2023/05/27 18:09:43 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 18:09:43 - mmengine - INFO - Epoch(train) [45][940/940] lr: 1.0000e-04 eta: 0:46:31 time: 0.5724 data_time: 0.0072 memory: 15293 grad_norm: 4.9516 loss: 0.6680 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6680
2023/05/27 18:09:43 - mmengine - INFO - Saving checkpoint at 45 epochs
2023/05/27 18:09:59 - mmengine - INFO - Epoch(val) [45][20/78] eta: 0:00:16 time: 0.2789 data_time: 0.0956 memory: 2851
2023/05/27 18:10:03 - mmengine - INFO - Epoch(val) [45][40/78] eta: 0:00:08 time: 0.1878 data_time: 0.0045 memory: 2851
2023/05/27 18:10:07 - mmengine - INFO - Epoch(val) [45][60/78] eta: 0:00:03 time: 0.1892 data_time: 0.0061 memory: 2851
2023/05/27 18:10:48 - mmengine - INFO - Epoch(val) [45][78/78] acc/top1: 0.7722 acc/top5: 0.9302 acc/mean1: 0.7721 data_time: 0.0280 time: 0.2083
2023/05/27 18:11:01 - mmengine - INFO - Epoch(train) [46][ 20/940] lr: 1.0000e-04 eta: 0:46:19 time: 0.6829 data_time: 0.0806 memory: 15293 grad_norm: 4.4313 loss: 0.4134 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4134
2023/05/27 18:11:13 - mmengine - INFO - Epoch(train) [46][ 40/940] lr: 1.0000e-04 eta: 0:46:07 time: 0.5912 data_time: 0.0078 memory: 15293 grad_norm: 5.3330 loss: 0.5424 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5424
2023/05/27 18:11:25 - mmengine - INFO - Epoch(train) [46][ 60/940] lr: 1.0000e-04 eta: 0:45:55 time: 0.5880 data_time: 0.0077 memory: 15293 grad_norm: 4.8930 loss: 0.5418 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5418
2023/05/27 18:11:37 - mmengine - INFO - Epoch(train) [46][ 80/940] lr: 1.0000e-04 eta: 0:45:44 time: 0.5901 data_time: 0.0078 memory: 15293 grad_norm: 5.1673 loss: 0.4612 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4612
2023/05/27 18:11:49 - mmengine - INFO - Epoch(train) [46][100/940] lr: 1.0000e-04 eta: 0:45:32 time: 0.5954 data_time: 0.0089 memory: 15293 grad_norm: 4.5934 loss: 0.5255 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5255
2023/05/27 18:12:01 - mmengine - INFO - Epoch(train) [46][120/940] lr: 1.0000e-04 eta: 0:45:20 time: 0.5880 data_time: 0.0079 memory: 15293 grad_norm: 5.2525 loss: 0.6350 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6350
2023/05/27 18:12:12 - mmengine - INFO - Epoch(train) [46][140/940] lr: 1.0000e-04 eta: 0:45:08 time: 0.5914 data_time: 0.0077 memory: 15293 grad_norm: 4.4645 loss: 0.6574 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6574
2023/05/27 18:12:24 - mmengine - INFO - Epoch(train) [46][160/940] lr: 1.0000e-04 eta: 0:44:56 time: 0.5865 data_time: 0.0075 memory: 15293 grad_norm: 4.5773 loss: 0.4658 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4658
2023/05/27 18:12:36 - mmengine - INFO - Epoch(train) [46][180/940] lr: 1.0000e-04 eta: 0:44:44 time: 0.5885 data_time: 0.0074 memory: 15293 grad_norm: 4.8732 loss: 0.5395 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5395
2023/05/27 18:12:48 - mmengine - INFO - Epoch(train) [46][200/940] lr: 1.0000e-04 eta: 0:44:32 time: 0.5914 data_time: 0.0076 memory: 15293 grad_norm: 4.8053 loss: 0.7301 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7301
2023/05/27 18:12:59 - mmengine - INFO - Epoch(train) [46][220/940] lr: 1.0000e-04 eta: 0:44:20 time: 0.5890 data_time: 0.0076 memory: 15293 grad_norm: 5.1229 loss: 0.4359 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4359
2023/05/27 18:13:11 - mmengine - INFO - Epoch(train) [46][240/940] lr: 1.0000e-04 eta: 0:44:08 time: 0.5901 data_time: 0.0075 memory: 15293 grad_norm: 4.4257 loss: 0.5525 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5525
2023/05/27 18:13:23 - mmengine - INFO - Epoch(train) [46][260/940] lr: 1.0000e-04 eta: 0:43:57 time: 0.5862 data_time: 0.0078 memory: 15293 grad_norm: 4.8961 loss: 0.5421 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5421
2023/05/27 18:13:35 - mmengine - INFO - Epoch(train) [46][280/940] lr: 1.0000e-04 eta: 0:43:45 time: 0.5919 data_time: 0.0077 memory: 15293 grad_norm: 4.7393 loss: 0.6305 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6305
2023/05/27 18:13:47 - mmengine - INFO - Epoch(train) [46][300/940] lr: 1.0000e-04 eta: 0:43:33 time: 0.5971 data_time: 0.0076 memory: 15293 grad_norm: 5.8981 loss: 0.5912 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5912
2023/05/27 18:13:59 - mmengine - INFO - Epoch(train) [46][320/940] lr: 1.0000e-04 eta: 0:43:21 time: 0.5953 data_time: 0.0075 memory: 15293 grad_norm: 4.7032 loss: 0.5715 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5715
2023/05/27 18:14:11 - mmengine - INFO - Epoch(train) [46][340/940] lr: 1.0000e-04 eta: 0:43:09 time: 0.6088 data_time: 0.0076 memory: 15293 grad_norm: 6.2302 loss: 0.4508 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4508
2023/05/27 18:14:23 - mmengine - INFO - Epoch(train) [46][360/940] lr: 1.0000e-04 eta: 0:42:57 time: 0.5978 data_time: 0.0076 memory: 15293 grad_norm: 4.3707 loss: 0.3604 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.3604
2023/05/27 18:14:35 - mmengine - INFO - Epoch(train) [46][380/940] lr: 1.0000e-04 eta: 0:42:45 time: 0.5864 data_time: 0.0079 memory: 15293 grad_norm: 4.7590 loss: 0.4477 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4477
2023/05/27 18:14:46 - mmengine - INFO - Epoch(train) [46][400/940] lr: 1.0000e-04 eta: 0:42:33 time: 0.5908 data_time: 0.0075 memory: 15293 grad_norm: 5.0863 loss: 0.6575 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.6575
2023/05/27 18:14:58 - mmengine - INFO - Epoch(train) [46][420/940] lr: 1.0000e-04 eta: 0:42:22 time: 0.5858 data_time: 0.0077 memory: 15293 grad_norm: 4.7780 loss: 0.6722 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6722
2023/05/27 18:15:10 - mmengine - INFO - Epoch(train) [46][440/940] lr: 1.0000e-04 eta: 0:42:10 time: 0.5962 data_time: 0.0075 memory: 15293 grad_norm: 4.8382 loss: 0.6786 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6786
2023/05/27 18:15:22 - mmengine - INFO - Epoch(train) [46][460/940] lr: 1.0000e-04 eta: 0:41:58 time: 0.5870 data_time: 0.0077 memory: 15293 grad_norm: 4.9854 loss: 0.6125 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6125
2023/05/27 18:15:34 - mmengine - INFO - Epoch(train) [46][480/940] lr: 1.0000e-04 eta: 0:41:46 time: 0.5912 data_time: 0.0079 memory: 15293 grad_norm: 4.6234 loss: 0.5988 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5988
2023/05/27 18:15:45 - mmengine - INFO - Epoch(train) [46][500/940] lr: 1.0000e-04 eta: 0:41:34 time: 0.5857 data_time: 0.0075 memory: 15293 grad_norm: 4.7212 loss: 0.6345 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6345
2023/05/27 18:15:57 - mmengine - INFO - Epoch(train) [46][520/940] lr: 1.0000e-04 eta: 0:41:22 time: 0.5911 data_time: 0.0076 memory: 15293 grad_norm: 4.8839 loss: 0.6085 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6085
2023/05/27 18:16:09 - mmengine - INFO - Epoch(train) [46][540/940] lr: 1.0000e-04 eta: 0:41:10 time: 0.5867 data_time: 0.0076 memory: 15293 grad_norm: 4.8634 loss: 0.6159 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6159
2023/05/27 18:16:21 - mmengine - INFO - Epoch(train) [46][560/940] lr: 1.0000e-04 eta: 0:40:58 time: 0.5866 data_time: 0.0078 memory: 15293 grad_norm: 4.4569 loss: 0.5887 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5887
2023/05/27 18:16:32 - mmengine - INFO - Epoch(train) [46][580/940] lr: 1.0000e-04 eta: 0:40:46 time: 0.5940 data_time: 0.0075 memory: 15293 grad_norm: 4.6820 loss: 0.7016 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7016
2023/05/27 18:16:44 - mmengine - INFO - Epoch(train) [46][600/940] lr: 1.0000e-04 eta: 0:40:34 time: 0.5866 data_time: 0.0078 memory: 15293 grad_norm: 5.1253 loss: 0.5073 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5073
2023/05/27 18:16:56 - mmengine - INFO - Epoch(train) [46][620/940] lr: 1.0000e-04 eta: 0:40:23 time: 0.5917 data_time: 0.0075 memory: 15293 grad_norm: 4.4531 loss: 0.4650 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4650
2023/05/27 18:17:08 - mmengine - INFO - Epoch(train) [46][640/940] lr: 1.0000e-04 eta: 0:40:11 time: 0.5914 data_time: 0.0086 memory: 15293 grad_norm: 4.7315 loss: 0.4687 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4687
2023/05/27 18:17:20 - mmengine - INFO - Epoch(train) [46][660/940] lr: 1.0000e-04 eta: 0:39:59 time: 0.5899 data_time: 0.0077 memory: 15293 grad_norm: 6.2543 loss: 0.5063 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5063
2023/05/27 18:17:31 - mmengine - INFO - Epoch(train) [46][680/940] lr: 1.0000e-04 eta: 0:39:47 time: 0.5885 data_time: 0.0078 memory: 15293 grad_norm: 4.7992 loss: 0.8114 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8114
2023/05/27 18:17:43 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 18:17:43 - mmengine - INFO - Epoch(train) [46][700/940] lr: 1.0000e-04 eta: 0:39:35 time: 0.5943 data_time: 0.0080 memory: 15293 grad_norm: 4.6557 loss: 0.4782 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4782
2023/05/27 18:17:55 - mmengine - INFO - Epoch(train) [46][720/940] lr: 1.0000e-04 eta: 0:39:23 time: 0.5872 data_time: 0.0080 memory: 15293 grad_norm: 4.6281 loss: 0.5807 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5807
2023/05/27 18:18:07 - mmengine - INFO - Epoch(train) [46][740/940] lr: 1.0000e-04 eta: 0:39:11 time: 0.5929 data_time: 0.0077 memory: 15293 grad_norm: 4.4969 loss: 0.6644 top1_acc: 0.6250 top5_acc: 0.6250 loss_cls: 0.6644
2023/05/27 18:18:19 - mmengine - INFO - Epoch(train) [46][760/940] lr: 1.0000e-04 eta: 0:38:59 time: 0.5886 data_time: 0.0096 memory: 15293 grad_norm: 4.7402 loss: 0.5954 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5954
2023/05/27 18:18:31 - mmengine - INFO - Epoch(train) [46][780/940] lr: 1.0000e-04 eta: 0:38:47 time: 0.5902 data_time: 0.0074 memory: 15293 grad_norm: 4.4156 loss: 0.6377 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6377
2023/05/27 18:18:42 - mmengine - INFO - Epoch(train) [46][800/940] lr: 1.0000e-04 eta: 0:38:36 time: 0.5972 data_time: 0.0077 memory: 15293 grad_norm: 4.3577 loss: 0.4191 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4191
2023/05/27 18:18:54 - mmengine - INFO - Epoch(train) [46][820/940] lr: 1.0000e-04 eta: 0:38:24 time: 0.5861 data_time: 0.0076 memory: 15293 grad_norm: 4.6731 loss: 0.5846 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5846
2023/05/27 18:19:06 - mmengine - INFO - Epoch(train) [46][840/940] lr: 1.0000e-04 eta: 0:38:12 time: 0.5997 data_time: 0.0077 memory: 15293 grad_norm: 4.6545 loss: 0.4792 top1_acc: 1.0000 top5_acc: 1.0000
loss_cls: 0.4792 2023/05/27 18:19:18 - mmengine - INFO - Epoch(train) [46][860/940] lr: 1.0000e-04 eta: 0:38:00 time: 0.5973 data_time: 0.0079 memory: 15293 grad_norm: 4.9208 loss: 0.7032 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7032 2023/05/27 18:19:30 - mmengine - INFO - Epoch(train) [46][880/940] lr: 1.0000e-04 eta: 0:37:48 time: 0.5956 data_time: 0.0074 memory: 15293 grad_norm: 4.6751 loss: 0.5751 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5751 2023/05/27 18:19:42 - mmengine - INFO - Epoch(train) [46][900/940] lr: 1.0000e-04 eta: 0:37:36 time: 0.5894 data_time: 0.0077 memory: 15293 grad_norm: 5.2537 loss: 0.5714 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5714 2023/05/27 18:19:54 - mmengine - INFO - Epoch(train) [46][920/940] lr: 1.0000e-04 eta: 0:37:24 time: 0.5937 data_time: 0.0076 memory: 15293 grad_norm: 5.1250 loss: 0.6258 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6258 2023/05/27 18:20:05 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 18:20:05 - mmengine - INFO - Epoch(train) [46][940/940] lr: 1.0000e-04 eta: 0:37:12 time: 0.5713 data_time: 0.0075 memory: 15293 grad_norm: 4.7145 loss: 0.6542 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.6542 2023/05/27 18:20:11 - mmengine - INFO - Epoch(val) [46][20/78] eta: 0:00:17 time: 0.3081 data_time: 0.1234 memory: 2851 2023/05/27 18:20:16 - mmengine - INFO - Epoch(val) [46][40/78] eta: 0:00:09 time: 0.2134 data_time: 0.0280 memory: 2851 2023/05/27 18:20:20 - mmengine - INFO - Epoch(val) [46][60/78] eta: 0:00:04 time: 0.2218 data_time: 0.0369 memory: 2851 2023/05/27 18:20:25 - mmengine - INFO - Epoch(val) [46][78/78] acc/top1: 0.7722 acc/top5: 0.9295 acc/mean1: 0.7721 data_time: 0.0487 time: 0.2307 2023/05/27 18:20:39 - mmengine - INFO - Epoch(train) [47][ 20/940] lr: 1.0000e-04 eta: 0:37:01 time: 0.6882 data_time: 0.0745 memory: 15293 grad_norm: 4.8945 loss: 0.5854 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5854 
2023/05/27 18:20:51 - mmengine - INFO - Epoch(train) [47][ 40/940] lr: 1.0000e-04 eta: 0:36:49 time: 0.5925 data_time: 0.0085 memory: 15293 grad_norm: 5.3074 loss: 0.5942 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.5942 2023/05/27 18:21:03 - mmengine - INFO - Epoch(train) [47][ 60/940] lr: 1.0000e-04 eta: 0:36:37 time: 0.5904 data_time: 0.0079 memory: 15293 grad_norm: 5.0133 loss: 0.5159 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5159 2023/05/27 18:21:15 - mmengine - INFO - Epoch(train) [47][ 80/940] lr: 1.0000e-04 eta: 0:36:25 time: 0.5891 data_time: 0.0078 memory: 15293 grad_norm: 4.5900 loss: 0.4230 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.4230 2023/05/27 18:21:27 - mmengine - INFO - Epoch(train) [47][100/940] lr: 1.0000e-04 eta: 0:36:13 time: 0.5932 data_time: 0.0079 memory: 15293 grad_norm: 4.6465 loss: 0.6900 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6900 2023/05/27 18:21:38 - mmengine - INFO - Epoch(train) [47][120/940] lr: 1.0000e-04 eta: 0:36:01 time: 0.5922 data_time: 0.0076 memory: 15293 grad_norm: 4.8154 loss: 0.4923 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4923 2023/05/27 18:21:50 - mmengine - INFO - Epoch(train) [47][140/940] lr: 1.0000e-04 eta: 0:35:49 time: 0.5906 data_time: 0.0078 memory: 15293 grad_norm: 5.0467 loss: 0.5135 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5135 2023/05/27 18:22:02 - mmengine - INFO - Epoch(train) [47][160/940] lr: 1.0000e-04 eta: 0:35:38 time: 0.5958 data_time: 0.0080 memory: 15293 grad_norm: 4.5195 loss: 0.6100 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6100 2023/05/27 18:22:14 - mmengine - INFO - Epoch(train) [47][180/940] lr: 1.0000e-04 eta: 0:35:26 time: 0.5870 data_time: 0.0077 memory: 15293 grad_norm: 7.3412 loss: 0.6853 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6853 2023/05/27 18:22:26 - mmengine - INFO - Epoch(train) [47][200/940] lr: 1.0000e-04 eta: 0:35:14 time: 0.5888 data_time: 0.0078 memory: 15293 grad_norm: 4.7497 loss: 0.5136 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5136 
2023/05/27 18:22:37 - mmengine - INFO - Epoch(train) [47][220/940] lr: 1.0000e-04 eta: 0:35:02 time: 0.5939 data_time: 0.0077 memory: 15293 grad_norm: 5.0778 loss: 0.5554 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5554 2023/05/27 18:22:49 - mmengine - INFO - Epoch(train) [47][240/940] lr: 1.0000e-04 eta: 0:34:50 time: 0.5904 data_time: 0.0079 memory: 15293 grad_norm: 4.7141 loss: 0.6056 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6056 2023/05/27 18:23:01 - mmengine - INFO - Epoch(train) [47][260/940] lr: 1.0000e-04 eta: 0:34:38 time: 0.5881 data_time: 0.0076 memory: 15293 grad_norm: 5.2611 loss: 0.4416 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4416 2023/05/27 18:23:13 - mmengine - INFO - Epoch(train) [47][280/940] lr: 1.0000e-04 eta: 0:34:26 time: 0.5885 data_time: 0.0077 memory: 15293 grad_norm: 5.2622 loss: 0.4597 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.4597 2023/05/27 18:23:25 - mmengine - INFO - Epoch(train) [47][300/940] lr: 1.0000e-04 eta: 0:34:14 time: 0.5883 data_time: 0.0077 memory: 15293 grad_norm: 4.7002 loss: 0.8025 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.8025 2023/05/27 18:23:36 - mmengine - INFO - Epoch(train) [47][320/940] lr: 1.0000e-04 eta: 0:34:02 time: 0.5903 data_time: 0.0077 memory: 15293 grad_norm: 4.6221 loss: 0.5974 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5974 2023/05/27 18:23:48 - mmengine - INFO - Epoch(train) [47][340/940] lr: 1.0000e-04 eta: 0:33:51 time: 0.5904 data_time: 0.0080 memory: 15293 grad_norm: 4.9790 loss: 0.4702 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4702 2023/05/27 18:24:00 - mmengine - INFO - Epoch(train) [47][360/940] lr: 1.0000e-04 eta: 0:33:39 time: 0.5911 data_time: 0.0076 memory: 15293 grad_norm: 4.8419 loss: 0.4698 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4698 2023/05/27 18:24:12 - mmengine - INFO - Epoch(train) [47][380/940] lr: 1.0000e-04 eta: 0:33:27 time: 0.5926 data_time: 0.0077 memory: 15293 grad_norm: 4.9589 loss: 0.6233 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6233 
2023/05/27 18:24:24 - mmengine - INFO - Epoch(train) [47][400/940] lr: 1.0000e-04 eta: 0:33:15 time: 0.5892 data_time: 0.0076 memory: 15293 grad_norm: 4.6039 loss: 0.6329 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6329 2023/05/27 18:24:36 - mmengine - INFO - Epoch(train) [47][420/940] lr: 1.0000e-04 eta: 0:33:03 time: 0.5956 data_time: 0.0077 memory: 15293 grad_norm: 4.6968 loss: 0.6749 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6749 2023/05/27 18:24:47 - mmengine - INFO - Epoch(train) [47][440/940] lr: 1.0000e-04 eta: 0:32:51 time: 0.5895 data_time: 0.0077 memory: 15293 grad_norm: 4.7953 loss: 0.6090 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6090 2023/05/27 18:24:59 - mmengine - INFO - Epoch(train) [47][460/940] lr: 1.0000e-04 eta: 0:32:39 time: 0.5923 data_time: 0.0076 memory: 15293 grad_norm: 4.6472 loss: 0.6555 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6555 2023/05/27 18:25:11 - mmengine - INFO - Epoch(train) [47][480/940] lr: 1.0000e-04 eta: 0:32:27 time: 0.5901 data_time: 0.0077 memory: 15293 grad_norm: 4.7361 loss: 0.6416 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6416 2023/05/27 18:25:23 - mmengine - INFO - Epoch(train) [47][500/940] lr: 1.0000e-04 eta: 0:32:16 time: 0.5901 data_time: 0.0078 memory: 15293 grad_norm: 4.4749 loss: 0.6705 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6705 2023/05/27 18:25:35 - mmengine - INFO - Epoch(train) [47][520/940] lr: 1.0000e-04 eta: 0:32:04 time: 0.5871 data_time: 0.0079 memory: 15293 grad_norm: 4.9252 loss: 0.5418 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5418 2023/05/27 18:25:47 - mmengine - INFO - Epoch(train) [47][540/940] lr: 1.0000e-04 eta: 0:31:52 time: 0.5994 data_time: 0.0078 memory: 15293 grad_norm: 4.5533 loss: 0.5126 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5126 2023/05/27 18:25:58 - mmengine - INFO - Epoch(train) [47][560/940] lr: 1.0000e-04 eta: 0:31:40 time: 0.5870 data_time: 0.0078 memory: 15293 grad_norm: 4.7290 loss: 0.7379 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7379 
2023/05/27 18:26:10 - mmengine - INFO - Epoch(train) [47][580/940] lr: 1.0000e-04 eta: 0:31:28 time: 0.5867 data_time: 0.0079 memory: 15293 grad_norm: 4.6397 loss: 0.5990 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5990 2023/05/27 18:26:22 - mmengine - INFO - Epoch(train) [47][600/940] lr: 1.0000e-04 eta: 0:31:16 time: 0.5899 data_time: 0.0077 memory: 15293 grad_norm: 4.5442 loss: 0.4760 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4760 2023/05/27 18:26:34 - mmengine - INFO - Epoch(train) [47][620/940] lr: 1.0000e-04 eta: 0:31:04 time: 0.5879 data_time: 0.0078 memory: 15293 grad_norm: 5.6734 loss: 0.6023 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6023 2023/05/27 18:26:45 - mmengine - INFO - Epoch(train) [47][640/940] lr: 1.0000e-04 eta: 0:30:52 time: 0.5918 data_time: 0.0075 memory: 15293 grad_norm: 4.4391 loss: 0.5576 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5576 2023/05/27 18:26:57 - mmengine - INFO - Epoch(train) [47][660/940] lr: 1.0000e-04 eta: 0:30:40 time: 0.5897 data_time: 0.0078 memory: 15293 grad_norm: 4.5859 loss: 0.5916 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5916 2023/05/27 18:27:09 - mmengine - INFO - Epoch(train) [47][680/940] lr: 1.0000e-04 eta: 0:30:29 time: 0.5938 data_time: 0.0078 memory: 15293 grad_norm: 4.6395 loss: 0.5355 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5355 2023/05/27 18:27:21 - mmengine - INFO - Epoch(train) [47][700/940] lr: 1.0000e-04 eta: 0:30:17 time: 0.5876 data_time: 0.0075 memory: 15293 grad_norm: 4.5498 loss: 0.6356 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6356 2023/05/27 18:27:33 - mmengine - INFO - Epoch(train) [47][720/940] lr: 1.0000e-04 eta: 0:30:05 time: 0.5935 data_time: 0.0076 memory: 15293 grad_norm: 5.1320 loss: 0.4344 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4344 2023/05/27 18:27:44 - mmengine - INFO - Epoch(train) [47][740/940] lr: 1.0000e-04 eta: 0:29:53 time: 0.5861 data_time: 0.0075 memory: 15293 grad_norm: 4.6001 loss: 0.6000 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6000 
2023/05/27 18:27:56 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 18:27:56 - mmengine - INFO - Epoch(train) [47][760/940] lr: 1.0000e-04 eta: 0:29:41 time: 0.5856 data_time: 0.0075 memory: 15293 grad_norm: 4.5376 loss: 0.5100 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5100 2023/05/27 18:28:08 - mmengine - INFO - Epoch(train) [47][780/940] lr: 1.0000e-04 eta: 0:29:29 time: 0.5972 data_time: 0.0077 memory: 15293 grad_norm: 5.2731 loss: 0.3830 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.3830 2023/05/27 18:28:20 - mmengine - INFO - Epoch(train) [47][800/940] lr: 1.0000e-04 eta: 0:29:17 time: 0.5894 data_time: 0.0076 memory: 15293 grad_norm: 4.6724 loss: 0.7081 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7081 2023/05/27 18:28:32 - mmengine - INFO - Epoch(train) [47][820/940] lr: 1.0000e-04 eta: 0:29:05 time: 0.5863 data_time: 0.0078 memory: 15293 grad_norm: 4.6890 loss: 0.5321 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5321 2023/05/27 18:28:43 - mmengine - INFO - Epoch(train) [47][840/940] lr: 1.0000e-04 eta: 0:28:53 time: 0.5897 data_time: 0.0077 memory: 15293 grad_norm: 5.0613 loss: 0.5467 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5467 2023/05/27 18:28:55 - mmengine - INFO - Epoch(train) [47][860/940] lr: 1.0000e-04 eta: 0:28:42 time: 0.5929 data_time: 0.0074 memory: 15293 grad_norm: 4.7931 loss: 0.5430 top1_acc: 0.5000 top5_acc: 1.0000 loss_cls: 0.5430 2023/05/27 18:29:07 - mmengine - INFO - Epoch(train) [47][880/940] lr: 1.0000e-04 eta: 0:28:30 time: 0.5925 data_time: 0.0087 memory: 15293 grad_norm: 4.7833 loss: 0.6922 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6922 2023/05/27 18:29:19 - mmengine - INFO - Epoch(train) [47][900/940] lr: 1.0000e-04 eta: 0:28:18 time: 0.5984 data_time: 0.0078 memory: 15293 grad_norm: 4.9138 loss: 0.6400 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6400 2023/05/27 18:29:31 - mmengine - INFO - Epoch(train) [47][920/940] lr: 1.0000e-04 
eta: 0:28:06 time: 0.5914 data_time: 0.0078 memory: 15293 grad_norm: 5.2212 loss: 0.5409 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5409 2023/05/27 18:29:42 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 18:29:42 - mmengine - INFO - Epoch(train) [47][940/940] lr: 1.0000e-04 eta: 0:27:54 time: 0.5664 data_time: 0.0077 memory: 15293 grad_norm: 4.5791 loss: 0.4995 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4995 2023/05/27 18:29:48 - mmengine - INFO - Epoch(val) [47][20/78] eta: 0:00:17 time: 0.2961 data_time: 0.1110 memory: 2851 2023/05/27 18:29:53 - mmengine - INFO - Epoch(val) [47][40/78] eta: 0:00:09 time: 0.2195 data_time: 0.0342 memory: 2851 2023/05/27 18:29:57 - mmengine - INFO - Epoch(val) [47][60/78] eta: 0:00:04 time: 0.2426 data_time: 0.0574 memory: 2851 2023/05/27 18:30:03 - mmengine - INFO - Epoch(val) [47][78/78] acc/top1: 0.7727 acc/top5: 0.9298 acc/mean1: 0.7726 data_time: 0.0523 time: 0.2346 2023/05/27 18:30:03 - mmengine - INFO - The previous best checkpoint /mnt/data/mmact/lilin/Repos/mmaction2/work_dirs/tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb/best_acc_top1_epoch_43.pth is removed 2023/05/27 18:30:04 - mmengine - INFO - The best checkpoint with 0.7727 acc/top1 at 47 epoch is saved to best_acc_top1_epoch_47.pth. 
2023/05/27 18:30:17 - mmengine - INFO - Epoch(train) [48][ 20/940] lr: 1.0000e-04 eta: 0:27:42 time: 0.6659 data_time: 0.0828 memory: 15293 grad_norm: 4.9690 loss: 0.5886 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5886 2023/05/27 18:30:29 - mmengine - INFO - Epoch(train) [48][ 40/940] lr: 1.0000e-04 eta: 0:27:30 time: 0.5877 data_time: 0.0079 memory: 15293 grad_norm: 4.6928 loss: 0.6984 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6984 2023/05/27 18:30:41 - mmengine - INFO - Epoch(train) [48][ 60/940] lr: 1.0000e-04 eta: 0:27:19 time: 0.5934 data_time: 0.0075 memory: 15293 grad_norm: 4.7941 loss: 0.4787 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4787 2023/05/27 18:30:53 - mmengine - INFO - Epoch(train) [48][ 80/940] lr: 1.0000e-04 eta: 0:27:07 time: 0.5876 data_time: 0.0078 memory: 15293 grad_norm: 4.3636 loss: 0.3754 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.3754 2023/05/27 18:31:05 - mmengine - INFO - Epoch(train) [48][100/940] lr: 1.0000e-04 eta: 0:26:55 time: 0.5886 data_time: 0.0082 memory: 15293 grad_norm: 5.0381 loss: 0.5351 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5351 2023/05/27 18:31:16 - mmengine - INFO - Epoch(train) [48][120/940] lr: 1.0000e-04 eta: 0:26:43 time: 0.5936 data_time: 0.0081 memory: 15293 grad_norm: 4.8882 loss: 0.5407 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5407 2023/05/27 18:31:28 - mmengine - INFO - Epoch(train) [48][140/940] lr: 1.0000e-04 eta: 0:26:31 time: 0.5901 data_time: 0.0077 memory: 15293 grad_norm: 4.9811 loss: 0.5338 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5338 2023/05/27 18:31:40 - mmengine - INFO - Epoch(train) [48][160/940] lr: 1.0000e-04 eta: 0:26:19 time: 0.5941 data_time: 0.0092 memory: 15293 grad_norm: 5.2259 loss: 0.5477 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5477 2023/05/27 18:31:52 - mmengine - INFO - Epoch(train) [48][180/940] lr: 1.0000e-04 eta: 0:26:07 time: 0.5893 data_time: 0.0076 memory: 15293 grad_norm: 4.5858 loss: 0.7155 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7155 
2023/05/27 18:32:04 - mmengine - INFO - Epoch(train) [48][200/940] lr: 1.0000e-04 eta: 0:25:55 time: 0.5986 data_time: 0.0076 memory: 15293 grad_norm: 4.5008 loss: 0.5791 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5791 2023/05/27 18:32:16 - mmengine - INFO - Epoch(train) [48][220/940] lr: 1.0000e-04 eta: 0:25:43 time: 0.5872 data_time: 0.0076 memory: 15293 grad_norm: 4.5169 loss: 0.4168 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4168 2023/05/27 18:32:28 - mmengine - INFO - Epoch(train) [48][240/940] lr: 1.0000e-04 eta: 0:25:32 time: 0.5933 data_time: 0.0074 memory: 15293 grad_norm: 4.8758 loss: 0.6105 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6105 2023/05/27 18:32:39 - mmengine - INFO - Epoch(train) [48][260/940] lr: 1.0000e-04 eta: 0:25:20 time: 0.5866 data_time: 0.0077 memory: 15293 grad_norm: 4.9484 loss: 0.5818 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5818 2023/05/27 18:32:51 - mmengine - INFO - Epoch(train) [48][280/940] lr: 1.0000e-04 eta: 0:25:08 time: 0.5913 data_time: 0.0074 memory: 15293 grad_norm: 4.6652 loss: 0.4373 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4373 2023/05/27 18:33:03 - mmengine - INFO - Epoch(train) [48][300/940] lr: 1.0000e-04 eta: 0:24:56 time: 0.5977 data_time: 0.0077 memory: 15293 grad_norm: 4.4686 loss: 0.5612 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5612 2023/05/27 18:33:15 - mmengine - INFO - Epoch(train) [48][320/940] lr: 1.0000e-04 eta: 0:24:44 time: 0.5901 data_time: 0.0077 memory: 15293 grad_norm: 4.5242 loss: 0.6554 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6554 2023/05/27 18:33:27 - mmengine - INFO - Epoch(train) [48][340/940] lr: 1.0000e-04 eta: 0:24:32 time: 0.5902 data_time: 0.0077 memory: 15293 grad_norm: 4.7786 loss: 0.7097 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7097 2023/05/27 18:33:38 - mmengine - INFO - Epoch(train) [48][360/940] lr: 1.0000e-04 eta: 0:24:20 time: 0.5863 data_time: 0.0078 memory: 15293 grad_norm: 4.4760 loss: 0.7375 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7375 
2023/05/27 18:33:50 - mmengine - INFO - Epoch(train) [48][380/940] lr: 1.0000e-04 eta: 0:24:08 time: 0.5915 data_time: 0.0075 memory: 15293 grad_norm: 5.7612 loss: 0.5632 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5632 2023/05/27 18:34:02 - mmengine - INFO - Epoch(train) [48][400/940] lr: 1.0000e-04 eta: 0:23:56 time: 0.5852 data_time: 0.0075 memory: 15293 grad_norm: 7.9338 loss: 0.4741 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4741 2023/05/27 18:34:14 - mmengine - INFO - Epoch(train) [48][420/940] lr: 1.0000e-04 eta: 0:23:45 time: 0.5859 data_time: 0.0075 memory: 15293 grad_norm: 4.7407 loss: 0.5208 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5208 2023/05/27 18:34:25 - mmengine - INFO - Epoch(train) [48][440/940] lr: 1.0000e-04 eta: 0:23:33 time: 0.5898 data_time: 0.0077 memory: 15293 grad_norm: 4.4875 loss: 0.5325 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5325 2023/05/27 18:34:37 - mmengine - INFO - Epoch(train) [48][460/940] lr: 1.0000e-04 eta: 0:23:21 time: 0.5946 data_time: 0.0076 memory: 15293 grad_norm: 4.5730 loss: 0.6881 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6881 2023/05/27 18:34:49 - mmengine - INFO - Epoch(train) [48][480/940] lr: 1.0000e-04 eta: 0:23:09 time: 0.5866 data_time: 0.0076 memory: 15293 grad_norm: 5.0925 loss: 0.5356 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5356 2023/05/27 18:35:01 - mmengine - INFO - Epoch(train) [48][500/940] lr: 1.0000e-04 eta: 0:22:57 time: 0.6045 data_time: 0.0077 memory: 15293 grad_norm: 4.4415 loss: 0.5439 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5439 2023/05/27 18:35:13 - mmengine - INFO - Epoch(train) [48][520/940] lr: 1.0000e-04 eta: 0:22:45 time: 0.5963 data_time: 0.0078 memory: 15293 grad_norm: 4.6873 loss: 0.5331 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5331 2023/05/27 18:35:25 - mmengine - INFO - Epoch(train) [48][540/940] lr: 1.0000e-04 eta: 0:22:33 time: 0.5864 data_time: 0.0079 memory: 15293 grad_norm: 5.0191 loss: 0.5530 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5530 
2023/05/27 18:35:37 - mmengine - INFO - Epoch(train) [48][560/940] lr: 1.0000e-04 eta: 0:22:21 time: 0.5873 data_time: 0.0077 memory: 15293 grad_norm: 4.5884 loss: 0.4741 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4741 2023/05/27 18:35:49 - mmengine - INFO - Epoch(train) [48][580/940] lr: 1.0000e-04 eta: 0:22:10 time: 0.6008 data_time: 0.0074 memory: 15293 grad_norm: 4.6774 loss: 0.4786 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4786 2023/05/27 18:36:01 - mmengine - INFO - Epoch(train) [48][600/940] lr: 1.0000e-04 eta: 0:21:58 time: 0.5974 data_time: 0.0074 memory: 15293 grad_norm: 4.9306 loss: 0.5754 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5754 2023/05/27 18:36:12 - mmengine - INFO - Epoch(train) [48][620/940] lr: 1.0000e-04 eta: 0:21:46 time: 0.5864 data_time: 0.0076 memory: 15293 grad_norm: 4.4356 loss: 0.5602 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5602 2023/05/27 18:36:24 - mmengine - INFO - Epoch(train) [48][640/940] lr: 1.0000e-04 eta: 0:21:34 time: 0.5988 data_time: 0.0078 memory: 15293 grad_norm: 4.6441 loss: 0.6825 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.6825 2023/05/27 18:36:36 - mmengine - INFO - Epoch(train) [48][660/940] lr: 1.0000e-04 eta: 0:21:22 time: 0.6057 data_time: 0.0075 memory: 15293 grad_norm: 5.1323 loss: 0.8350 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.8350 2023/05/27 18:36:48 - mmengine - INFO - Epoch(train) [48][680/940] lr: 1.0000e-04 eta: 0:21:10 time: 0.5925 data_time: 0.0076 memory: 15293 grad_norm: 5.2533 loss: 0.6775 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6775 2023/05/27 18:37:00 - mmengine - INFO - Epoch(train) [48][700/940] lr: 1.0000e-04 eta: 0:20:58 time: 0.5855 data_time: 0.0077 memory: 15293 grad_norm: 4.5606 loss: 0.5846 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5846 2023/05/27 18:37:12 - mmengine - INFO - Epoch(train) [48][720/940] lr: 1.0000e-04 eta: 0:20:46 time: 0.5902 data_time: 0.0076 memory: 15293 grad_norm: 5.0866 loss: 0.4948 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4948 
2023/05/27 18:37:23 - mmengine - INFO - Epoch(train) [48][740/940] lr: 1.0000e-04 eta: 0:20:35 time: 0.5851 data_time: 0.0076 memory: 15293 grad_norm: 4.5930 loss: 0.5896 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.5896 2023/05/27 18:37:35 - mmengine - INFO - Epoch(train) [48][760/940] lr: 1.0000e-04 eta: 0:20:23 time: 0.5873 data_time: 0.0077 memory: 15293 grad_norm: 4.6718 loss: 0.5205 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5205 2023/05/27 18:37:47 - mmengine - INFO - Epoch(train) [48][780/940] lr: 1.0000e-04 eta: 0:20:11 time: 0.5856 data_time: 0.0079 memory: 15293 grad_norm: 5.0129 loss: 0.4725 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4725 2023/05/27 18:37:59 - mmengine - INFO - Epoch(train) [48][800/940] lr: 1.0000e-04 eta: 0:19:59 time: 0.5945 data_time: 0.0076 memory: 15293 grad_norm: 4.6385 loss: 0.6253 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.6253 2023/05/27 18:38:11 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 18:38:11 - mmengine - INFO - Epoch(train) [48][820/940] lr: 1.0000e-04 eta: 0:19:47 time: 0.5878 data_time: 0.0076 memory: 15293 grad_norm: 4.6526 loss: 0.6897 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6897 2023/05/27 18:38:22 - mmengine - INFO - Epoch(train) [48][840/940] lr: 1.0000e-04 eta: 0:19:35 time: 0.5863 data_time: 0.0075 memory: 15293 grad_norm: 5.1377 loss: 0.7254 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7254 2023/05/27 18:38:34 - mmengine - INFO - Epoch(train) [48][860/940] lr: 1.0000e-04 eta: 0:19:23 time: 0.5897 data_time: 0.0076 memory: 15293 grad_norm: 5.1426 loss: 0.5689 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5689 2023/05/27 18:38:46 - mmengine - INFO - Epoch(train) [48][880/940] lr: 1.0000e-04 eta: 0:19:11 time: 0.5979 data_time: 0.0076 memory: 15293 grad_norm: 4.7304 loss: 0.4786 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4786 2023/05/27 18:38:58 - mmengine - INFO - Epoch(train) [48][900/940] lr: 1.0000e-04 
eta: 0:19:00 time: 0.5922 data_time: 0.0075 memory: 15293 grad_norm: 4.7990 loss: 0.7100 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7100 2023/05/27 18:39:10 - mmengine - INFO - Epoch(train) [48][920/940] lr: 1.0000e-04 eta: 0:18:48 time: 0.5867 data_time: 0.0075 memory: 15293 grad_norm: 5.0297 loss: 0.5907 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5907 2023/05/27 18:39:21 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306 2023/05/27 18:39:21 - mmengine - INFO - Epoch(train) [48][940/940] lr: 1.0000e-04 eta: 0:18:36 time: 0.5681 data_time: 0.0074 memory: 15293 grad_norm: 5.2012 loss: 0.6585 top1_acc: 0.5000 top5_acc: 0.5000 loss_cls: 0.6585 2023/05/27 18:39:21 - mmengine - INFO - Saving checkpoint at 48 epochs 2023/05/27 18:39:29 - mmengine - INFO - Epoch(val) [48][20/78] eta: 0:00:17 time: 0.3024 data_time: 0.1181 memory: 2851 2023/05/27 18:39:34 - mmengine - INFO - Epoch(val) [48][40/78] eta: 0:00:09 time: 0.2198 data_time: 0.0350 memory: 2851 2023/05/27 18:39:38 - mmengine - INFO - Epoch(val) [48][60/78] eta: 0:00:04 time: 0.2270 data_time: 0.0429 memory: 2851 2023/05/27 18:39:43 - mmengine - INFO - Epoch(val) [48][78/78] acc/top1: 0.7722 acc/top5: 0.9290 acc/mean1: 0.7720 data_time: 0.0506 time: 0.2319 2023/05/27 18:39:57 - mmengine - INFO - Epoch(train) [49][ 20/940] lr: 1.0000e-04 eta: 0:18:24 time: 0.6889 data_time: 0.0819 memory: 15293 grad_norm: 4.7485 loss: 0.8321 top1_acc: 0.5000 top5_acc: 0.8750 loss_cls: 0.8321 2023/05/27 18:40:09 - mmengine - INFO - Epoch(train) [49][ 40/940] lr: 1.0000e-04 eta: 0:18:12 time: 0.5898 data_time: 0.0078 memory: 15293 grad_norm: 4.5811 loss: 0.6459 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6459 2023/05/27 18:40:21 - mmengine - INFO - Epoch(train) [49][ 60/940] lr: 1.0000e-04 eta: 0:18:00 time: 0.6007 data_time: 0.0078 memory: 15293 grad_norm: 4.8709 loss: 0.6550 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6550 2023/05/27 18:40:32 - mmengine - 
INFO - Epoch(train) [49][ 80/940] lr: 1.0000e-04 eta: 0:17:48 time: 0.5898 data_time: 0.0074 memory: 15293 grad_norm: 4.2742 loss: 0.6126 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6126
2023/05/27 18:40:44 - mmengine - INFO - Epoch(train) [49][100/940] lr: 1.0000e-04 eta: 0:17:36 time: 0.5922 data_time: 0.0078 memory: 15293 grad_norm: 4.9484 loss: 0.7211 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7211
2023/05/27 18:40:56 - mmengine - INFO - Epoch(train) [49][120/940] lr: 1.0000e-04 eta: 0:17:25 time: 0.5896 data_time: 0.0079 memory: 15293 grad_norm: 4.6030 loss: 0.7351 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7351
2023/05/27 18:41:08 - mmengine - INFO - Epoch(train) [49][140/940] lr: 1.0000e-04 eta: 0:17:13 time: 0.5885 data_time: 0.0074 memory: 15293 grad_norm: 5.3375 loss: 0.5537 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5537
2023/05/27 18:41:20 - mmengine - INFO - Epoch(train) [49][160/940] lr: 1.0000e-04 eta: 0:17:01 time: 0.5963 data_time: 0.0077 memory: 15293 grad_norm: 5.5036 loss: 0.6204 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6204
2023/05/27 18:41:32 - mmengine - INFO - Epoch(train) [49][180/940] lr: 1.0000e-04 eta: 0:16:49 time: 0.5950 data_time: 0.0075 memory: 15293 grad_norm: 5.0269 loss: 0.5349 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.5349
2023/05/27 18:41:44 - mmengine - INFO - Epoch(train) [49][200/940] lr: 1.0000e-04 eta: 0:16:37 time: 0.5901 data_time: 0.0079 memory: 15293 grad_norm: 4.7204 loss: 0.5731 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5731
2023/05/27 18:41:55 - mmengine - INFO - Epoch(train) [49][220/940] lr: 1.0000e-04 eta: 0:16:25 time: 0.5937 data_time: 0.0076 memory: 15293 grad_norm: 6.7047 loss: 0.5596 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5596
2023/05/27 18:42:07 - mmengine - INFO - Epoch(train) [49][240/940] lr: 1.0000e-04 eta: 0:16:13 time: 0.5890 data_time: 0.0076 memory: 15293 grad_norm: 4.9345 loss: 0.6211 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6211
2023/05/27 18:42:19 - mmengine - INFO - Epoch(train) [49][260/940] lr: 1.0000e-04 eta: 0:16:01 time: 0.5933 data_time: 0.0076 memory: 15293 grad_norm: 4.6144 loss: 0.7013 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7013
2023/05/27 18:42:31 - mmengine - INFO - Epoch(train) [49][280/940] lr: 1.0000e-04 eta: 0:15:50 time: 0.5978 data_time: 0.0077 memory: 15293 grad_norm: 4.4889 loss: 0.5071 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5071
2023/05/27 18:42:43 - mmengine - INFO - Epoch(train) [49][300/940] lr: 1.0000e-04 eta: 0:15:38 time: 0.5856 data_time: 0.0078 memory: 15293 grad_norm: 4.6660 loss: 0.4401 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4401
2023/05/27 18:42:54 - mmengine - INFO - Epoch(train) [49][320/940] lr: 1.0000e-04 eta: 0:15:26 time: 0.5866 data_time: 0.0077 memory: 15293 grad_norm: 4.5361 loss: 0.6539 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6539
2023/05/27 18:43:06 - mmengine - INFO - Epoch(train) [49][340/940] lr: 1.0000e-04 eta: 0:15:14 time: 0.5938 data_time: 0.0078 memory: 15293 grad_norm: 4.6124 loss: 0.6992 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6992
2023/05/27 18:43:18 - mmengine - INFO - Epoch(train) [49][360/940] lr: 1.0000e-04 eta: 0:15:02 time: 0.5975 data_time: 0.0076 memory: 15293 grad_norm: 4.8634 loss: 0.6326 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6326
2023/05/27 18:43:30 - mmengine - INFO - Epoch(train) [49][380/940] lr: 1.0000e-04 eta: 0:14:50 time: 0.5891 data_time: 0.0075 memory: 15293 grad_norm: 4.6290 loss: 0.6629 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.6629
2023/05/27 18:43:42 - mmengine - INFO - Epoch(train) [49][400/940] lr: 1.0000e-04 eta: 0:14:38 time: 0.5895 data_time: 0.0076 memory: 15293 grad_norm: 4.6571 loss: 0.6336 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6336
2023/05/27 18:43:54 - mmengine - INFO - Epoch(train) [49][420/940] lr: 1.0000e-04 eta: 0:14:26 time: 0.5877 data_time: 0.0075 memory: 15293 grad_norm: 4.7339 loss: 0.9405 top1_acc: 0.6250 top5_acc: 0.6250 loss_cls: 0.9405
2023/05/27 18:44:05 - mmengine - INFO - Epoch(train) [49][440/940] lr: 1.0000e-04 eta: 0:14:15 time: 0.5894 data_time: 0.0077 memory: 15293 grad_norm: 5.1716 loss: 0.5890 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5890
2023/05/27 18:44:17 - mmengine - INFO - Epoch(train) [49][460/940] lr: 1.0000e-04 eta: 0:14:03 time: 0.5938 data_time: 0.0075 memory: 15293 grad_norm: 4.7580 loss: 0.5551 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5551
2023/05/27 18:44:29 - mmengine - INFO - Epoch(train) [49][480/940] lr: 1.0000e-04 eta: 0:13:51 time: 0.5922 data_time: 0.0076 memory: 15293 grad_norm: 4.5021 loss: 0.5276 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5276
2023/05/27 18:44:41 - mmengine - INFO - Epoch(train) [49][500/940] lr: 1.0000e-04 eta: 0:13:39 time: 0.5867 data_time: 0.0075 memory: 15293 grad_norm: 5.0485 loss: 0.7166 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.7166
2023/05/27 18:44:53 - mmengine - INFO - Epoch(train) [49][520/940] lr: 1.0000e-04 eta: 0:13:27 time: 0.5957 data_time: 0.0077 memory: 15293 grad_norm: 7.2445 loss: 0.5987 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5987
2023/05/27 18:45:05 - mmengine - INFO - Epoch(train) [49][540/940] lr: 1.0000e-04 eta: 0:13:15 time: 0.5955 data_time: 0.0076 memory: 15293 grad_norm: 4.9160 loss: 0.5144 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5144
2023/05/27 18:45:17 - mmengine - INFO - Epoch(train) [49][560/940] lr: 1.0000e-04 eta: 0:13:03 time: 0.5904 data_time: 0.0074 memory: 15293 grad_norm: 4.4867 loss: 0.5398 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5398
2023/05/27 18:45:28 - mmengine - INFO - Epoch(train) [49][580/940] lr: 1.0000e-04 eta: 0:12:51 time: 0.5860 data_time: 0.0080 memory: 15293 grad_norm: 4.4631 loss: 0.5425 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5425
2023/05/27 18:45:40 - mmengine - INFO - Epoch(train) [49][600/940] lr: 1.0000e-04 eta: 0:12:40 time: 0.5887 data_time: 0.0077 memory: 15293 grad_norm: 4.5377 loss: 0.5713 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5713
2023/05/27 18:45:52 - mmengine - INFO - Epoch(train) [49][620/940] lr: 1.0000e-04 eta: 0:12:28 time: 0.5870 data_time: 0.0078 memory: 15293 grad_norm: 4.7751 loss: 0.5078 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5078
2023/05/27 18:46:04 - mmengine - INFO - Epoch(train) [49][640/940] lr: 1.0000e-04 eta: 0:12:16 time: 0.5919 data_time: 0.0076 memory: 15293 grad_norm: 4.8028 loss: 0.5933 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5933
2023/05/27 18:46:15 - mmengine - INFO - Epoch(train) [49][660/940] lr: 1.0000e-04 eta: 0:12:04 time: 0.5864 data_time: 0.0078 memory: 15293 grad_norm: 6.0762 loss: 0.6642 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6642
2023/05/27 18:46:27 - mmengine - INFO - Epoch(train) [49][680/940] lr: 1.0000e-04 eta: 0:11:52 time: 0.5913 data_time: 0.0077 memory: 15293 grad_norm: 4.7154 loss: 0.5870 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.5870
2023/05/27 18:46:39 - mmengine - INFO - Epoch(train) [49][700/940] lr: 1.0000e-04 eta: 0:11:40 time: 0.5900 data_time: 0.0074 memory: 15293 grad_norm: 5.0802 loss: 0.6119 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6119
2023/05/27 18:46:51 - mmengine - INFO - Epoch(train) [49][720/940] lr: 1.0000e-04 eta: 0:11:28 time: 0.5861 data_time: 0.0074 memory: 15293 grad_norm: 4.2578 loss: 0.6843 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6843
2023/05/27 18:47:02 - mmengine - INFO - Epoch(train) [49][740/940] lr: 1.0000e-04 eta: 0:11:16 time: 0.5907 data_time: 0.0078 memory: 15293 grad_norm: 4.4978 loss: 0.7142 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7142
2023/05/27 18:47:15 - mmengine - INFO - Epoch(train) [49][760/940] lr: 1.0000e-04 eta: 0:11:04 time: 0.6028 data_time: 0.0077 memory: 15293 grad_norm: 4.6820 loss: 0.4327 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.4327
2023/05/27 18:47:26 - mmengine - INFO - Epoch(train) [49][780/940] lr: 1.0000e-04 eta: 0:10:53 time: 0.5900 data_time: 0.0076 memory: 15293 grad_norm: 5.2089 loss: 0.4675 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.4675
2023/05/27 18:47:38 - mmengine - INFO - Epoch(train) [49][800/940] lr: 1.0000e-04 eta: 0:10:41 time: 0.5946 data_time: 0.0077 memory: 15293 grad_norm: 4.5643 loss: 0.6873 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6873
2023/05/27 18:47:50 - mmengine - INFO - Epoch(train) [49][820/940] lr: 1.0000e-04 eta: 0:10:29 time: 0.5920 data_time: 0.0077 memory: 15293 grad_norm: 5.5470 loss: 0.7555 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.7555
2023/05/27 18:48:02 - mmengine - INFO - Epoch(train) [49][840/940] lr: 1.0000e-04 eta: 0:10:17 time: 0.5869 data_time: 0.0077 memory: 15293 grad_norm: 4.9884 loss: 0.5327 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5327
2023/05/27 18:48:14 - mmengine - INFO - Epoch(train) [49][860/940] lr: 1.0000e-04 eta: 0:10:05 time: 0.5905 data_time: 0.0078 memory: 15293 grad_norm: 4.7102 loss: 0.6847 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6847
2023/05/27 18:48:25 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 18:48:25 - mmengine - INFO - Epoch(train) [49][880/940] lr: 1.0000e-04 eta: 0:09:53 time: 0.5877 data_time: 0.0075 memory: 15293 grad_norm: 4.6673 loss: 0.5007 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5007
2023/05/27 18:48:37 - mmengine - INFO - Epoch(train) [49][900/940] lr: 1.0000e-04 eta: 0:09:41 time: 0.5906 data_time: 0.0078 memory: 15293 grad_norm: 4.6136 loss: 0.6418 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6418
2023/05/27 18:48:49 - mmengine - INFO - Epoch(train) [49][920/940] lr: 1.0000e-04 eta: 0:09:29 time: 0.5897 data_time: 0.0076 memory: 15293 grad_norm: 5.0514 loss: 0.5768 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5768
2023/05/27 18:49:00 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 18:49:00 - mmengine - INFO - Epoch(train) [49][940/940] lr: 1.0000e-04 eta: 0:09:18 time: 0.5683 data_time: 0.0076 memory: 15293 grad_norm: 4.7583 loss: 0.6887 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6887
2023/05/27 18:49:06 - mmengine - INFO - Epoch(val) [49][20/78] eta: 0:00:17 time: 0.3015 data_time: 0.1165 memory: 2851
2023/05/27 18:49:11 - mmengine - INFO - Epoch(val) [49][40/78] eta: 0:00:09 time: 0.2061 data_time: 0.0208 memory: 2851
2023/05/27 18:49:15 - mmengine - INFO - Epoch(val) [49][60/78] eta: 0:00:04 time: 0.2281 data_time: 0.0434 memory: 2851
2023/05/27 18:50:17 - mmengine - INFO - Epoch(val) [49][78/78] acc/top1: 0.7726 acc/top5: 0.9297 acc/mean1: 0.7725 data_time: 0.0468 time: 0.2288
2023/05/27 18:50:31 - mmengine - INFO - Epoch(train) [50][ 20/940] lr: 1.0000e-04 eta: 0:09:06 time: 0.6814 data_time: 0.0732 memory: 15293 grad_norm: 4.8171 loss: 0.5845 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5845
2023/05/27 18:50:43 - mmengine - INFO - Epoch(train) [50][ 40/940] lr: 1.0000e-04 eta: 0:08:54 time: 0.5855 data_time: 0.0076 memory: 15293 grad_norm: 5.1820 loss: 0.4782 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.4782
2023/05/27 18:50:55 - mmengine - INFO - Epoch(train) [50][ 60/940] lr: 1.0000e-04 eta: 0:08:42 time: 0.5903 data_time: 0.0076 memory: 15293 grad_norm: 4.8878 loss: 0.6221 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6221
2023/05/27 18:51:06 - mmengine - INFO - Epoch(train) [50][ 80/940] lr: 1.0000e-04 eta: 0:08:30 time: 0.5868 data_time: 0.0076 memory: 15293 grad_norm: 4.9672 loss: 0.6703 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6703
2023/05/27 18:51:18 - mmengine - INFO - Epoch(train) [50][100/940] lr: 1.0000e-04 eta: 0:08:18 time: 0.5965 data_time: 0.0076 memory: 15293 grad_norm: 4.5547 loss: 0.4848 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4848
2023/05/27 18:51:30 - mmengine - INFO - Epoch(train) [50][120/940] lr: 1.0000e-04 eta: 0:08:06 time: 0.5937 data_time: 0.0077 memory: 15293 grad_norm: 4.5916 loss: 0.5908 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.5908
2023/05/27 18:51:42 - mmengine - INFO - Epoch(train) [50][140/940] lr: 1.0000e-04 eta: 0:07:55 time: 0.5948 data_time: 0.0077 memory: 15293 grad_norm: 5.3841 loss: 0.3904 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.3904
2023/05/27 18:51:54 - mmengine - INFO - Epoch(train) [50][160/940] lr: 1.0000e-04 eta: 0:07:43 time: 0.5929 data_time: 0.0077 memory: 15293 grad_norm: 4.9874 loss: 0.6756 top1_acc: 0.7500 top5_acc: 0.7500 loss_cls: 0.6756
2023/05/27 18:52:06 - mmengine - INFO - Epoch(train) [50][180/940] lr: 1.0000e-04 eta: 0:07:31 time: 0.5879 data_time: 0.0078 memory: 15293 grad_norm: 4.6437 loss: 0.6074 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6074
2023/05/27 18:52:17 - mmengine - INFO - Epoch(train) [50][200/940] lr: 1.0000e-04 eta: 0:07:19 time: 0.5883 data_time: 0.0079 memory: 15293 grad_norm: 4.4763 loss: 0.4718 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4718
2023/05/27 18:52:29 - mmengine - INFO - Epoch(train) [50][220/940] lr: 1.0000e-04 eta: 0:07:07 time: 0.5863 data_time: 0.0077 memory: 15293 grad_norm: 5.2224 loss: 0.6569 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6569
2023/05/27 18:52:41 - mmengine - INFO - Epoch(train) [50][240/940] lr: 1.0000e-04 eta: 0:06:55 time: 0.5855 data_time: 0.0076 memory: 15293 grad_norm: 5.0232 loss: 0.5753 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5753
2023/05/27 18:52:53 - mmengine - INFO - Epoch(train) [50][260/940] lr: 1.0000e-04 eta: 0:06:43 time: 0.5899 data_time: 0.0079 memory: 15293 grad_norm: 4.3849 loss: 0.5327 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5327
2023/05/27 18:53:05 - mmengine - INFO - Epoch(train) [50][280/940] lr: 1.0000e-04 eta: 0:06:31 time: 0.5969 data_time: 0.0074 memory: 15293 grad_norm: 4.8347 loss: 0.5182 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5182
2023/05/27 18:53:16 - mmengine - INFO - Epoch(train) [50][300/940] lr: 1.0000e-04 eta: 0:06:19 time: 0.5882 data_time: 0.0079 memory: 15293 grad_norm: 4.5745 loss: 0.5969 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5969
2023/05/27 18:53:28 - mmengine - INFO - Epoch(train) [50][320/940] lr: 1.0000e-04 eta: 0:06:08 time: 0.5907 data_time: 0.0076 memory: 15293 grad_norm: 4.9766 loss: 0.4267 top1_acc: 0.6250 top5_acc: 0.7500 loss_cls: 0.4267
2023/05/27 18:53:40 - mmengine - INFO - Epoch(train) [50][340/940] lr: 1.0000e-04 eta: 0:05:56 time: 0.5917 data_time: 0.0082 memory: 15293 grad_norm: 4.8320 loss: 0.7861 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7861
2023/05/27 18:53:52 - mmengine - INFO - Epoch(train) [50][360/940] lr: 1.0000e-04 eta: 0:05:44 time: 0.5860 data_time: 0.0076 memory: 15293 grad_norm: 4.8050 loss: 0.6713 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6713
2023/05/27 18:54:04 - mmengine - INFO - Epoch(train) [50][380/940] lr: 1.0000e-04 eta: 0:05:32 time: 0.5894 data_time: 0.0076 memory: 15293 grad_norm: 4.5692 loss: 0.6053 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6053
2023/05/27 18:54:15 - mmengine - INFO - Epoch(train) [50][400/940] lr: 1.0000e-04 eta: 0:05:20 time: 0.5858 data_time: 0.0077 memory: 15293 grad_norm: 4.7951 loss: 0.6971 top1_acc: 0.7500 top5_acc: 1.0000 loss_cls: 0.6971
2023/05/27 18:54:27 - mmengine - INFO - Epoch(train) [50][420/940] lr: 1.0000e-04 eta: 0:05:08 time: 0.5924 data_time: 0.0076 memory: 15293 grad_norm: 4.5629 loss: 0.5832 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.5832
2023/05/27 18:54:39 - mmengine - INFO - Epoch(train) [50][440/940] lr: 1.0000e-04 eta: 0:04:56 time: 0.6042 data_time: 0.0074 memory: 15293 grad_norm: 4.8821 loss: 0.6033 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6033
2023/05/27 18:54:51 - mmengine - INFO - Epoch(train) [50][460/940] lr: 1.0000e-04 eta: 0:04:44 time: 0.5946 data_time: 0.0075 memory: 15293 grad_norm: 4.4373 loss: 0.5482 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5482
2023/05/27 18:55:03 - mmengine - INFO - Epoch(train) [50][480/940] lr: 1.0000e-04 eta: 0:04:33 time: 0.5858 data_time: 0.0074 memory: 15293 grad_norm: 4.7063 loss: 0.5421 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5421
2023/05/27 18:55:15 - mmengine - INFO - Epoch(train) [50][500/940] lr: 1.0000e-04 eta: 0:04:21 time: 0.5969 data_time: 0.0077 memory: 15293 grad_norm: 5.2371 loss: 0.4274 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4274
2023/05/27 18:55:27 - mmengine - INFO - Epoch(train) [50][520/940] lr: 1.0000e-04 eta: 0:04:09 time: 0.5952 data_time: 0.0077 memory: 15293 grad_norm: 4.8040 loss: 0.6298 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6298
2023/05/27 18:55:38 - mmengine - INFO - Epoch(train) [50][540/940] lr: 1.0000e-04 eta: 0:03:57 time: 0.5911 data_time: 0.0075 memory: 15293 grad_norm: 4.7426 loss: 0.6352 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6352
2023/05/27 18:55:50 - mmengine - INFO - Epoch(train) [50][560/940] lr: 1.0000e-04 eta: 0:03:45 time: 0.5972 data_time: 0.0074 memory: 15293 grad_norm: 5.1008 loss: 0.6435 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6435
2023/05/27 18:56:02 - mmengine - INFO - Epoch(train) [50][580/940] lr: 1.0000e-04 eta: 0:03:33 time: 0.5886 data_time: 0.0076 memory: 15293 grad_norm: 4.9230 loss: 0.6304 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.6304
2023/05/27 18:56:14 - mmengine - INFO - Epoch(train) [50][600/940] lr: 1.0000e-04 eta: 0:03:21 time: 0.5910 data_time: 0.0080 memory: 15293 grad_norm: 4.6092 loss: 0.3995 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.3995
2023/05/27 18:56:26 - mmengine - INFO - Epoch(train) [50][620/940] lr: 1.0000e-04 eta: 0:03:09 time: 0.5889 data_time: 0.0076 memory: 15293 grad_norm: 4.6897 loss: 0.5191 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.5191
2023/05/27 18:56:38 - mmengine - INFO - Epoch(train) [50][640/940] lr: 1.0000e-04 eta: 0:02:58 time: 0.5944 data_time: 0.0079 memory: 15293 grad_norm: 5.1465 loss: 0.4680 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4680
2023/05/27 18:56:49 - mmengine - INFO - Epoch(train) [50][660/940] lr: 1.0000e-04 eta: 0:02:46 time: 0.5874 data_time: 0.0075 memory: 15293 grad_norm: 5.1700 loss: 0.4472 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4472
2023/05/27 18:57:01 - mmengine - INFO - Epoch(train) [50][680/940] lr: 1.0000e-04 eta: 0:02:34 time: 0.5885 data_time: 0.0078 memory: 15293 grad_norm: 4.3845 loss: 0.6418 top1_acc: 0.7500 top5_acc: 0.8750 loss_cls: 0.6418
2023/05/27 18:57:13 - mmengine - INFO - Epoch(train) [50][700/940] lr: 1.0000e-04 eta: 0:02:22 time: 0.5908 data_time: 0.0075 memory: 15293 grad_norm: 4.7247 loss: 0.5167 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.5167
2023/05/27 18:57:25 - mmengine - INFO - Epoch(train) [50][720/940] lr: 1.0000e-04 eta: 0:02:10 time: 0.5908 data_time: 0.0079 memory: 15293 grad_norm: 4.8598 loss: 0.4213 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4213
2023/05/27 18:57:37 - mmengine - INFO - Epoch(train) [50][740/940] lr: 1.0000e-04 eta: 0:01:58 time: 0.5866 data_time: 0.0076 memory: 15293 grad_norm: 4.7340 loss: 0.4958 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4958
2023/05/27 18:57:48 - mmengine - INFO - Epoch(train) [50][760/940] lr: 1.0000e-04 eta: 0:01:46 time: 0.5882 data_time: 0.0078 memory: 15293 grad_norm: 4.8116 loss: 0.4982 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4982
2023/05/27 18:58:00 - mmengine - INFO - Epoch(train) [50][780/940] lr: 1.0000e-04 eta: 0:01:34 time: 0.5932 data_time: 0.0077 memory: 15293 grad_norm: 4.4280 loss: 0.4423 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4423
2023/05/27 18:58:12 - mmengine - INFO - Epoch(train) [50][800/940] lr: 1.0000e-04 eta: 0:01:23 time: 0.5922 data_time: 0.0079 memory: 15293 grad_norm: 4.9922 loss: 0.4513 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.4513
2023/05/27 18:58:24 - mmengine - INFO - Epoch(train) [50][820/940] lr: 1.0000e-04 eta: 0:01:11 time: 0.5888 data_time: 0.0076 memory: 15293 grad_norm: 4.4537 loss: 0.7711 top1_acc: 0.8750 top5_acc: 1.0000 loss_cls: 0.7711
2023/05/27 18:58:36 - mmengine - INFO - Epoch(train) [50][840/940] lr: 1.0000e-04 eta: 0:00:59 time: 0.5863 data_time: 0.0079 memory: 15293 grad_norm: 5.9997 loss: 0.6432 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.6432
2023/05/27 18:58:47 - mmengine - INFO - Epoch(train) [50][860/940] lr: 1.0000e-04 eta: 0:00:47 time: 0.5911 data_time: 0.0077 memory: 15293 grad_norm: 4.2905 loss: 0.4367 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.4367
2023/05/27 18:58:59 - mmengine - INFO - Epoch(train) [50][880/940] lr: 1.0000e-04 eta: 0:00:35 time: 0.5866 data_time: 0.0075 memory: 15293 grad_norm: 4.9663 loss: 0.7607 top1_acc: 0.8750 top5_acc: 0.8750 loss_cls: 0.7607
2023/05/27 18:59:11 - mmengine - INFO - Epoch(train) [50][900/940] lr: 1.0000e-04 eta: 0:00:23 time: 0.5979 data_time: 0.0078 memory: 15293 grad_norm: 4.5981 loss: 0.7976 top1_acc: 0.6250 top5_acc: 1.0000 loss_cls: 0.7976
2023/05/27 18:59:23 - mmengine - INFO - Epoch(train) [50][920/940] lr: 1.0000e-04 eta: 0:00:11 time: 0.5927 data_time: 0.0077 memory: 15293 grad_norm: 4.7820 loss: 0.7375 top1_acc: 0.6250 top5_acc: 0.8750 loss_cls: 0.7375
2023/05/27 18:59:34 - mmengine - INFO - Exp name: tsn_imagenet-pretrained-swin-transformer_32xb8-1x1x8-50e_kinetics400-rgb_20230527_140306
2023/05/27 18:59:34 - mmengine - INFO - Epoch(train) [50][940/940] lr: 1.0000e-04 eta: 0:00:00 time: 0.5773 data_time: 0.0073 memory: 15293 grad_norm: 4.7753 loss: 0.5888 top1_acc: 1.0000 top5_acc: 1.0000 loss_cls: 0.5888
2023/05/27 18:59:34 - mmengine - INFO - Saving checkpoint at 50 epochs
2023/05/27 19:00:13 - mmengine - INFO - Epoch(val) [50][20/78] eta: 0:00:14 time: 0.2484 data_time: 0.0670 memory: 2851
2023/05/27 19:00:17 - mmengine - INFO - Epoch(val) [50][40/78] eta: 0:00:08 time: 0.1863 data_time: 0.0046 memory: 2851
2023/05/27 19:00:20 - mmengine - INFO - Epoch(val) [50][60/78] eta: 0:00:03 time: 0.1871 data_time: 0.0052 memory: 2851
2023/05/27 19:01:34 - mmengine - INFO - Epoch(val) [50][78/78] acc/top1: 0.7711 acc/top5: 0.9296 acc/mean1: 0.7710 data_time: 0.0206 time: 0.1995
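The per-epoch validation summaries above (e.g. `acc/top1: 0.7711`) can be pulled out of a log like this with a short script. Below is a minimal sketch; the regex and the helper name `parse_val_summaries` are my own, not part of mmengine, and assume the standard `Epoch(val) [E][N/N] acc/top1: … acc/top5: …` line layout seen in this log.

```python
import re

# Matches mmengine validation-summary lines such as:
# "2023/05/27 19:01:34 - mmengine - INFO - Epoch(val) [50][78/78] acc/top1: 0.7711 acc/top5: 0.9296 ..."
VAL_RE = re.compile(
    r"Epoch\(val\) \[(?P<epoch>\d+)\]\[\d+/\d+\] "
    r"acc/top1: (?P<top1>[\d.]+) acc/top5: (?P<top5>[\d.]+)"
)

def parse_val_summaries(log_text):
    """Return (epoch, top1, top5) tuples for every validation summary in the log."""
    return [
        (int(m.group("epoch")), float(m.group("top1")), float(m.group("top5")))
        for m in VAL_RE.finditer(log_text)
    ]

line = ("2023/05/27 19:01:34 - mmengine - INFO - Epoch(val) [50][78/78] "
        "acc/top1: 0.7711 acc/top5: 0.9296 acc/mean1: 0.7710 data_time: 0.0206 time: 0.1995")
print(parse_val_summaries(line))  # -> [(50, 0.7711, 0.9296)]
```

Intermediate `Epoch(val)` progress lines (those with only `eta`/`time` fields) contain no `acc/top1:` token and are skipped automatically.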