Classifying the UC Merced Land Use Dataset with ResNet

The grind of a pre-first-year grad student.

This post focuses on the model itself.

I trained on the UC Merced Land Use Dataset — long story short, about 1000 images — and got roughly 75% accuracy. Oddly, resnet-101 came out less accurate than resnet-18. In theory the more complex model should do at least as well, yet accuracy dropped; with this little data and no pretrained weights, the deeper network most likely overfits or simply fails to converge in the same number of epochs.

ResidualBlock

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.in_channels, self.out_channels = in_channels, out_channels
        self.blocks = nn.Identity()     # overridden by subclasses with real layers
        self.shortcut = nn.Identity()

    def forward(self, x):
        residual = x
        if self.should_apply_shortcut:
            residual = self.shortcut(x)
        x = self.blocks(x)
        x += residual
        return x

    @property
    def should_apply_shortcut(self):
        # a projection shortcut is only needed when the channel count changes
        return self.in_channels != self.out_channels
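As a quick sanity check (my own minimal sketch, not from the original post): with both `blocks` and `shortcut` left as `nn.Identity()`, the block just computes x + x. The class is repeated here so the snippet runs standalone.

```python
import torch
import torch.nn as nn

# ResidualBlock repeated from above so this snippet is self-contained
class ResidualBlock(nn.Module):
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.in_channels, self.out_channels = in_channels, out_channels
        self.blocks = nn.Identity()
        self.shortcut = nn.Identity()

    def forward(self, x):
        residual = x
        if self.should_apply_shortcut:
            residual = self.shortcut(x)
        x = self.blocks(x)
        x += residual
        return x

    @property
    def should_apply_shortcut(self):
        return self.in_channels != self.out_channels

# with both branches left as Identity, the block returns x + x
block = ResidualBlock(32, 32)
y = block(torch.ones(1, 32, 4, 4))
print(y.mean().item())  # 2.0
```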

The `nn.Identity()` placeholders correspond to the two paths in the classic residual-block diagram: the curved shortcut skips over the two stacked layers and is added back to their output.

[figure: residual-block diagram showing the shortcut curve bypassing two layers]
from collections import OrderedDict

class ResNetResidualBlock(ResidualBlock):
    def __init__(self, in_channels, out_channels, expansion=1, downsampling=1, conv=conv3x3, *args, **kwargs):
        super().__init__(in_channels, out_channels)
        self.expansion, self.downsampling, self.conv = expansion, downsampling, conv
        self.shortcut = nn.Sequential(OrderedDict({
            'conv': nn.Conv2d(self.in_channels, self.expanded_channels, kernel_size=1,
                              stride=self.downsampling, bias=False),
            'bn': nn.BatchNorm2d(self.expanded_channels)
        })) if self.should_apply_shortcut else None

    @property
    def expanded_channels(self):
        return self.out_channels * self.expansion

    @property
    def should_apply_shortcut(self):
        return self.in_channels != self.expanded_channels

This part is fairly easy to follow: the shortcut is a 1x1 convolution plus batch norm, and it is only built when the input and (expanded) output channel counts differ.
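One piece missing from this excerpt is `conv3x3`. In the original repo it is (roughly, reconstructed from memory of that codebase — details may differ) a `functools.partial` over a Conv2d subclass that derives its own padding from the kernel size:

```python
from functools import partial

import torch
import torch.nn as nn

class Conv2dAuto(nn.Conv2d):
    """Conv2d that derives 'same'-style padding from its kernel size."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # nn.Conv2d's forward reads self.padding, so overriding it here is enough
        self.padding = (self.kernel_size[0] // 2, self.kernel_size[1] // 2)

conv3x3 = partial(Conv2dAuto, kernel_size=3, bias=False)

# at stride 1 the spatial size is preserved while channels change
x = torch.randn(1, 32, 56, 56)
conv = conv3x3(in_channels=32, out_channels=64)
print(conv(x).shape)  # torch.Size([1, 64, 56, 56])
```

This is also why the layer summary later shows `Conv2dAuto` entries instead of plain `Conv2d`.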

class ResNetEncoder(nn.Module):

    def __init__(self, in_channels=3, blocks_sizes=[64, 128, 256, 512], deepths=[2,2,2,2],
                 activation=nn.ReLU, block=ResNetBasicBlock, *args,**kwargs):
        super().__init__()

        self.blocks_sizes = blocks_sizes

        self.gate = nn.Sequential(
            nn.Conv2d(in_channels, self.blocks_sizes[0], kernel_size=7, stride=2, padding=3, bias=False),
            nn.BatchNorm2d(self.blocks_sizes[0]),
            activation(),
            nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        )

        self.in_out_block_sizes = list(zip(blocks_sizes, blocks_sizes[1:]))
        self.blocks = nn.ModuleList([
            ResNetLayer(blocks_sizes[0], blocks_sizes[0], n=deepths[0], activation=activation,
                        block=block,  *args, **kwargs),
            *[ResNetLayer(in_channels * block.expansion,
                          out_channels, n=n, activation=activation,
                          block=block, *args, **kwargs)
              for (in_channels, out_channels), n in zip(self.in_out_block_sizes, deepths[1:])]
        ])

    def forward(self, x):
        x = self.gate(x)
        for block in self.blocks:
            x = block(x)
        return x
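The gate (stem) halves the resolution twice: a stride-2 7x7 convolution followed by a stride-2 max pool. A quick shape check of that stem in isolation (UC Merced images are 256x256):

```python
import torch
import torch.nn as nn

# the encoder's stem: 7x7 stride-2 conv, BN, ReLU, then 3x3 stride-2 max pool
gate = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False),
    nn.BatchNorm2d(64),
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=3, stride=2, padding=1),
)

x = torch.randn(1, 3, 256, 256)   # UC Merced images are 256x256
print(gate(x).shape)              # torch.Size([1, 64, 64, 64]): 256 -> 128 -> 64
```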
class ResnetDecoder(nn.Module):

    def __init__(self, in_features, n_classes):
        super().__init__()
        self.avg = nn.AdaptiveAvgPool2d((1, 1))
        self.decoder = nn.Linear(in_features, n_classes)

    def forward(self, x):
        x = self.avg(x)
        x = x.view(x.size(0), -1)
        x = self.decoder(x)
        return x
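The decoder is just global average pooling plus a linear head. A standalone sketch of the same three steps (the 512 input features and 21 classes are illustrative values — UC Merced has 21 categories):

```python
import torch
import torch.nn as nn

# global average pool to 1x1, flatten, then a linear classifier
avg = nn.AdaptiveAvgPool2d((1, 1))
fc = nn.Linear(512, 21)

feat = torch.randn(4, 512, 8, 8)   # a batch of encoder feature maps
x = avg(feat)                      # -> (4, 512, 1, 1)
x = x.view(x.size(0), -1)          # -> (4, 512)
logits = fc(x)
print(logits.shape)                # torch.Size([4, 21])
```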
class ResNet(nn.Module):

    def __init__(self, in_channels, n_classes, *args, **kwargs):
        super().__init__()
        self.encoder = ResNetEncoder(in_channels, *args, **kwargs)
        self.decoder = ResnetDecoder(self.encoder.blocks[-1].blocks[-1].expanded_channels, n_classes)

    def forward(self, x):
        x = self.encoder(x)
        x = self.decoder(x)
        return x
import numpy as np

def params_count(model):
    """Count the number of parameters in a model.

    Args:
        model: the model whose parameters are counted.
    """
    return np.sum([p.numel() for p in model.parameters()]).item()

That's essentially the whole model.
One thing that confused me in the source was the `deepths` argument (the repo's spelling of "depths") — it turns out to be simply the number of residual blocks in each of the four stages, and those per-stage counts are exactly what distinguish the standard ResNet variants:

def resnet18(in_channels, n_classes):
    return ResNet(in_channels, n_classes, block=ResNetBasicBlock, deepths=[2, 2, 2, 2])

def resnet34(in_channels, n_classes):
    return ResNet(in_channels, n_classes, block=ResNetBasicBlock, deepths=[3, 4, 6, 3])

def resnet50(in_channels, n_classes):
    return ResNet(in_channels, n_classes, block=ResNetBottleNeckBlock, deepths=[3, 4, 6, 3])

def resnet101(in_channels, n_classes):
    return ResNet(in_channels, n_classes, block=ResNetBottleNeckBlock, deepths=[3, 4, 23, 3])

def resnet152(in_channels, n_classes):
    return ResNet(in_channels, n_classes, block=ResNetBottleNeckBlock, deepths=[3, 8, 36, 3])
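The per-stage block counts are also where the model names come from. A quick arithmetic check (my own sketch): total layers = 1 stem conv + (convs per block) x (total blocks) + 1 fully connected layer.

```python
# "deepths" = residual blocks per stage; basic blocks hold two 3x3 convs,
# bottleneck blocks hold three convs (1x1, 3x3, 1x1)
configs = {
    "resnet18":  ([2, 2, 2, 2], 2),
    "resnet34":  ([3, 4, 6, 3], 2),
    "resnet50":  ([3, 4, 6, 3], 3),
    "resnet101": ([3, 4, 23, 3], 3),
    "resnet152": ([3, 8, 36, 3], 3),
}
layer_counts = {name: 1 + convs * sum(depths) + 1
                for name, (depths, convs) in configs.items()}
print(layer_counts)
# {'resnet18': 18, 'resnet34': 34, 'resnet50': 50, 'resnet101': 101, 'resnet152': 152}
```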

That's the complete model. When using it, remember to call one of the factory functions:

model = resnet101(in_channels, n_classes).cuda()
print('params num:', params_count(model))

rather than instantiating the ResNet class directly.
The original model code comes from FrancescoSaverioZuppichini on GitHub.
To inspect the network structure (torchsummary must be installed):

from torchsummary import summary

model = resnet152(3, 1000).cuda()
summary(model, (3, 1024, 1024))
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 64, 512, 512]           9,408
       BatchNorm2d-2         [-1, 64, 512, 512]             128
              ReLU-3         [-1, 64, 512, 512]               0
         MaxPool2d-4         [-1, 64, 256, 256]               0
            Conv2d-5        [-1, 256, 256, 256]          16,384
       BatchNorm2d-6        [-1, 256, 256, 256]             512
        Conv2dAuto-7         [-1, 64, 256, 256]           4,096
       BatchNorm2d-8         [-1, 64, 256, 256]             128
              ReLU-9         [-1, 64, 256, 256]               0
       Conv2dAuto-10         [-1, 64, 256, 256]          36,864
      BatchNorm2d-11         [-1, 64, 256, 256]             128
             ReLU-12         [-1, 64, 256, 256]               0
       Conv2dAuto-13        [-1, 256, 256, 256]          16,384
      BatchNorm2d-14        [-1, 256, 256, 256]             512
ResNetBottleNeckBlock-15        [-1, 256, 256, 256]               0
       Conv2dAuto-16         [-1, 64, 256, 256]          16,384
      BatchNorm2d-17         [-1, 64, 256, 256]             128
             ReLU-18         [-1, 64, 256, 256]               0
       Conv2dAuto-19         [-1, 64, 256, 256]          36,864
      BatchNorm2d-20         [-1, 64, 256, 256]             128
             ReLU-21         [-1, 64, 256, 256]               0
       Conv2dAuto-22        [-1, 256, 256, 256]          16,384
      BatchNorm2d-23        [-1, 256, 256, 256]             512
ResNetBottleNeckBlock-24        [-1, 256, 256, 256]               0
       Conv2dAuto-25         [-1, 64, 256, 256]          16,384
      BatchNorm2d-26         [-1, 64, 256, 256]             128
             ReLU-27         [-1, 64, 256, 256]               0
       Conv2dAuto-28         [-1, 64, 256, 256]          36,864
      BatchNorm2d-29         [-1, 64, 256, 256]             128
             ReLU-30         [-1, 64, 256, 256]               0
       Conv2dAuto-31        [-1, 256, 256, 256]          16,384
      BatchNorm2d-32        [-1, 256, 256, 256]             512
ResNetBottleNeckBlock-33        [-1, 256, 256, 256]               0
      ResNetLayer-34        [-1, 256, 256, 256]               0
               ...
(summary truncated: the same Conv2dAuto / BatchNorm2d / ReLU bottleneck
pattern repeats through the 512-, 1024- and 2048-channel stages before
the final pooling and fully connected layer)
      Conv2dAuto-454         [-1, 2048, 32, 32]       1,048,576
     BatchNorm2d-455         [-1, 2048, 32, 32]           4,096
ResNetBottleNeckBlock-456         [-1, 2048, 32, 32]               0
      Conv2dAuto-457          [-1, 512, 32, 32]       1,048,576
     BatchNorm2d-458          [-1, 512, 32, 32]           1,024
            ReLU-459          [-1, 512, 32, 32]               0
      Conv2dAuto-460          [-1, 512, 32, 32]       2,359,296
     BatchNorm2d-461          [-1, 512, 32, 32]           1,024
            ReLU-462          [-1, 512, 32, 32]               0
      Conv2dAuto-463         [-1, 2048, 32, 32]       1,048,576
     BatchNorm2d-464         [-1, 2048, 32, 32]           4,096
ResNetBottleNeckBlock-465         [-1, 2048, 32, 32]               0
     ResNetLayer-466         [-1, 2048, 32, 32]               0
   ResNetEncoder-467         [-1, 2048, 32, 32]               0
AdaptiveAvgPool2d-468           [-1, 2048, 1, 1]               0
          Linear-469                 [-1, 1000]       2,049,000
   ResnetDecoder-470                 [-1, 1000]               0
================================================================
Total params: 60,192,808
Trainable params: 60,192,808
Non-trainable params: 0

A solid 155 convolutional layers in total, quite a deep model. And thanks to the shortcut connections, if some layer overfits or stops helping, the network can effectively bypass it. Kaiming He's design really is impressive.
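The per-layer parameter counts in the summary above are easy to verify by hand. A minimal sketch (the helper names are my own, not from the original code) that reproduces the numbers for one bottleneck block (1024 → 256 → 256 → 1024, bias-free convolutions):

```python
def conv_params(c_in, c_out, k):
    """Parameters of a bias-free k x k convolution."""
    return c_in * c_out * k * k

def bn_params(c):
    """BatchNorm2d stores a learnable weight and bias per channel."""
    return 2 * c

# One bottleneck block from the summary: 1x1 reduce, 3x3, 1x1 expand.
block = (
    conv_params(1024, 256, 1) + bn_params(256)
    + conv_params(256, 256, 3) + bn_params(256)
    + conv_params(256, 1024, 1) + bn_params(1024)
)
print(conv_params(1024, 256, 1))  # 262144, matches the summary
print(conv_params(256, 256, 3))   # 589824, matches the summary
print(block)                      # 1117184 parameters per block
```

Summing blocks like this over all stages (plus the stem and the 2048 × 1000 classifier head, 2,049,000 parameters) is how the 60,192,808 total comes about.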

[001/050] 18.78 sec(s) Train Acc: 0.222500 Loss: 0.793812 | Val Acc: 0.215000 loss: 0.900603
[002/050] 16.20 sec(s) Train Acc: 0.301250 Loss: 0.470209 | Val Acc: 0.400000 loss: 0.380915
[003/050] 16.43 sec(s) Train Acc: 0.373750 Loss: 0.419760 | Val Acc: 0.430000 loss: 0.421366
[004/050] 16.21 sec(s) Train Acc: 0.457500 Loss: 0.372583 | Val Acc: 0.395000 loss: 0.577332
[005/050] 15.70 sec(s) Train Acc: 0.460000 Loss: 0.363756 | Val Acc: 0.310000 loss: 0.617506
[006/050] 16.15 sec(s) Train Acc: 0.506250 Loss: 0.335631 | Val Acc: 0.420000 loss: 0.434179
[007/050] 16.27 sec(s) Train Acc: 0.506250 Loss: 0.346168 | Val Acc: 0.395000 loss: 0.457854
[008/050] 16.17 sec(s) Train Acc: 0.523750 Loss: 0.328431 | Val Acc: 0.380000 loss: 0.587639
[009/050] 15.68 sec(s) Train Acc: 0.532500 Loss: 0.317814 | Val Acc: 0.435000 loss: 0.417199
[010/050] 15.68 sec(s) Train Acc: 0.571250 Loss: 0.301545 | Val Acc: 0.610000 loss: 0.265376
[011/050] 15.49 sec(s) Train Acc: 0.571250 Loss: 0.288428 | Val Acc: 0.525000 loss: 0.328064
[012/050] 16.02 sec(s) Train Acc: 0.588750 Loss: 0.280327 | Val Acc: 0.580000 loss: 0.373323
[013/050] 16.14 sec(s) Train Acc: 0.588750 Loss: 0.276690 | Val Acc: 0.455000 loss: 0.530960
[014/050] 15.68 sec(s) Train Acc: 0.595000 Loss: 0.278174 | Val Acc: 0.355000 loss: 0.819857
[015/050] 16.49 sec(s) Train Acc: 0.637500 Loss: 0.251591 | Val Acc: 0.535000 loss: 0.429338
[016/050] 15.93 sec(s) Train Acc: 0.617500 Loss: 0.268316 | Val Acc: 0.385000 loss: 0.915231
[017/050] 15.66 sec(s) Train Acc: 0.620000 Loss: 0.264712 | Val Acc: 0.425000 loss: 0.901076
[018/050] 15.80 sec(s) Train Acc: 0.640000 Loss: 0.250248 | Val Acc: 0.420000 loss: 0.609772
[019/050] 15.40 sec(s) Train Acc: 0.662500 Loss: 0.238282 | Val Acc: 0.450000 loss: 0.595310
[020/050] 15.48 sec(s) Train Acc: 0.653750 Loss: 0.244724 | Val Acc: 0.415000 loss: 0.842149
[021/050] 15.94 sec(s) Train Acc: 0.658750 Loss: 0.230015 | Val Acc: 0.530000 loss: 0.535357
[022/050] 16.50 sec(s) Train Acc: 0.716250 Loss: 0.203477 | Val Acc: 0.625000 loss: 0.354239
[023/050] 16.11 sec(s) Train Acc: 0.647500 Loss: 0.236142 | Val Acc: 0.545000 loss: 0.561437
[024/050] 16.05 sec(s) Train Acc: 0.683750 Loss: 0.224490 | Val Acc: 0.370000 loss: 1.728145
[025/050] 15.88 sec(s) Train Acc: 0.701250 Loss: 0.198898 | Val Acc: 0.435000 loss: 0.770039
[026/050] 15.96 sec(s) Train Acc: 0.703750 Loss: 0.206086 | Val Acc: 0.530000 loss: 0.774096
[027/050] 16.21 sec(s) Train Acc: 0.705000 Loss: 0.217260 | Val Acc: 0.410000 loss: 0.609468
[028/050] 16.18 sec(s) Train Acc: 0.727500 Loss: 0.196382 | Val Acc: 0.345000 loss: 1.229072
[029/050] 15.82 sec(s) Train Acc: 0.702500 Loss: 0.215304 | Val Acc: 0.535000 loss: 0.476039
[030/050] 15.70 sec(s) Train Acc: 0.720000 Loss: 0.187142 | Val Acc: 0.390000 loss: 2.383541
[031/050] 15.84 sec(s) Train Acc: 0.736250 Loss: 0.187775 | Val Acc: 0.555000 loss: 0.547186
[032/050] 15.60 sec(s) Train Acc: 0.743750 Loss: 0.191418 | Val Acc: 0.585000 loss: 0.760037
[033/050] 15.77 sec(s) Train Acc: 0.756250 Loss: 0.171473 | Val Acc: 0.575000 loss: 0.383314
[034/050] 15.78 sec(s) Train Acc: 0.750000 Loss: 0.174410 | Val Acc: 0.615000 loss: 0.532383
[035/050] 15.59 sec(s) Train Acc: 0.766250 Loss: 0.174796 | Val Acc: 0.575000 loss: 0.591066
[036/050] 15.81 sec(s) Train Acc: 0.753750 Loss: 0.182459 | Val Acc: 0.560000 loss: 0.855034
[037/050] 16.07 sec(s) Train Acc: 0.780000 Loss: 0.156893 | Val Acc: 0.510000 loss: 0.504993
[038/050] 15.83 sec(s) Train Acc: 0.778750 Loss: 0.166430 | Val Acc: 0.580000 loss: 0.983311
[039/050] 15.73 sec(s) Train Acc: 0.807500 Loss: 0.147398 | Val Acc: 0.665000 loss: 0.591307
[040/050] 15.59 sec(s) Train Acc: 0.793750 Loss: 0.156374 | Val Acc: 0.560000 loss: 0.794995
[041/050] 15.39 sec(s) Train Acc: 0.793750 Loss: 0.155911 | Val Acc: 0.610000 loss: 0.498716
[042/050] 15.55 sec(s) Train Acc: 0.833750 Loss: 0.133526 | Val Acc: 0.630000 loss: 0.802893
[043/050] 16.05 sec(s) Train Acc: 0.777500 Loss: 0.167620 | Val Acc: 0.490000 loss: 1.040836
[044/050] 15.66 sec(s) Train Acc: 0.817500 Loss: 0.135479 | Val Acc: 0.670000 loss: 0.363291
[045/050] 16.09 sec(s) Train Acc: 0.812500 Loss: 0.131704 | Val Acc: 0.640000 loss: 0.703357
[046/050] 16.48 sec(s) Train Acc: 0.808750 Loss: 0.146896 | Val Acc: 0.765000 loss: 0.251931
[047/050] 15.50 sec(s) Train Acc: 0.797500 Loss: 0.145349 | Val Acc: 0.665000 loss: 0.450391
[048/050] 15.88 sec(s) Train Acc: 0.810000 Loss: 0.137287 | Val Acc: 0.620000 loss: 0.803165
[049/050] 16.09 sec(s) Train Acc: 0.815000 Loss: 0.136437 | Val Acc: 0.550000 loss: 0.561041
[050/050] 15.64 sec(s) Train Acc: 0.791250 Loss: 0.152648 | Val Acc: 0.645000 loss: 0.564178
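Note that the best validation accuracy in the run above (76.5%, epoch 46) is not at the final epoch, so it pays to checkpoint on validation accuracy rather than keep the last model. A minimal sketch of pulling the best epoch out of log lines in the format printed above (the helper name and the two sample lines are just for illustration):

```python
import re

# Two sample lines copied from the training log above.
log = [
    "[046/050] 16.48 sec(s) Train Acc: 0.808750 Loss: 0.146896 | Val Acc: 0.765000 loss: 0.251931",
    "[050/050] 15.64 sec(s) Train Acc: 0.791250 Loss: 0.152648 | Val Acc: 0.645000 loss: 0.564178",
]

def best_epoch(lines):
    """Return (epoch, val_acc) for the line with the highest Val Acc."""
    pat = re.compile(r"\[(\d+)/\d+\].*Val Acc: ([0-9.]+)")
    best = max(lines, key=lambda l: float(pat.search(l).group(2)))
    m = pat.search(best)
    return int(m.group(1)), float(m.group(2))

print(best_epoch(log))  # (46, 0.765)
```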

Checking the generated CSV against the labels, 23 out of 80 were wrong, which is acceptable. It's a pity I didn't make better use of the training data; a different split would probably help, but I can't change it.
If you need the full code, or the scripts for generating the confusion matrix and so on, just message me directly. I've been around lately.

Original: https://blog.csdn.net/weixin_43730207/article/details/126336539
Author: 一个不会读文献的参考文献
Title: 采用Resnet做UC Merced Land Use Dataset数据分类

Original articles are protected by copyright. Please cite the source when reposting: https://www.johngo689.com/666601/

