18/10/07 21:24:55.65 mIq+f5AO0.net
URL link: arxiv.org
> 3.2 Global Average Pooling
> ...
> However, the fully connected layers are prone to overfitting, thus hampering the generalization ability
> of the overall network. Dropout is proposed by Hinton et al. [5] as a regularizer which randomly
> sets half of the activations to the fully connected layers to zero during training. It has improved the
> generalization ability and largely prevents overfitting [4].
> In this paper, we propose another strategy called global average pooling to replace the traditional
> fully connected layers in CNN.
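The quoted passage contrasts a fully connected classification head (regularized with dropout) against global average pooling, which simply averages each feature map over its spatial dimensions. A minimal numpy sketch of both ideas, assuming toy shapes (batch of 4, 10 classes, 7x7 feature maps) rather than anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature maps from a conv stack: (batch, channels, height, width).
# With global average pooling, the last conv layer is arranged to emit
# one feature map per class, so channels == num_classes here.
num_classes = 10
feats = rng.standard_normal((4, num_classes, 7, 7))

# Global average pooling: average each map over its spatial dims,
# giving one confidence value per class. No extra weights to overfit.
gap = feats.mean(axis=(2, 3))              # shape (4, 10)

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

probs = softmax(gap)                       # class probabilities, (4, 10)

# Contrast: the traditional head flattens the maps and feeds a fully
# connected layer with 10*7*7 * num_classes learnable weights.
flat = feats.reshape(4, -1)                # shape (4, 490)
W = rng.standard_normal((flat.shape[1], num_classes)) * 0.01
fc_logits = flat @ W                       # same output shape, many more parameters

# Dropout as described in the quote: during training, randomly zero
# half of the inputs to the fully connected layer (inverted scaling).
mask = rng.random(flat.shape) < 0.5
dropped = flat * mask / 0.5
```

Note the parameter counts: the global-average-pooling head has zero parameters, while the fully connected head above carries 490 x 10 weights, which is where the overfitting risk the paper mentions comes from.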