TY  - JOUR
AV  - public
UR  - https://doi.org/10.3390/e26110974
SN  - 1099-4300
A1  - Lyu, Zhaoyan
A1  - Rodrigues, Miguel R. D.
N2  - Deep learning has made significant strides, driving advances in areas like computer vision, natural language processing, and autonomous systems. In this paper, we further investigate the role of additive shortcut connections, focusing on models such as ResNet, Vision Transformers (ViTs), and MLP-Mixers, given that these connections are essential in enabling efficient information flow and in mitigating optimization challenges such as vanishing gradients. In particular, capitalizing on our recent information bottleneck approach, we analyze how additive shortcuts influence the fitting and compression phases of training, which are crucial for generalization. We leverage Z-X and Z-Y measures as practical alternatives to mutual information for observing these dynamics in high-dimensional spaces. Our empirical results demonstrate that models with identity shortcuts (ISs) often skip the initial fitting phase and move directly into the compression phase, while non-identity shortcut (NIS) models follow the conventional two-phase process. Furthermore, we explore how IS models are still able to compress effectively, maintaining their generalization capacity despite bypassing the early fitting stages. These findings offer new insights into the dynamics of shortcut connections in neural networks, contributing to the optimization of modern deep learning architectures.
KW  - deep learning
KW  - neural networks
KW  - transformer
KW  - shortcut connections
KW  - information bottleneck theory
VL  - 26
JF  - Entropy
IS  - 11
TI  - Exploring the Impact of Additive Shortcuts in Neural Networks via Information Bottleneck-like Dynamics: From ResNet to Transformer
PB  - MDPI AG
Y1  - 2024/11/14/
ID  - discovery10200131
ER  -