Shared Representational Geometry Across Neural Networks
Author(s): Lu, Qihong; Chen, Po-Hsuan; Pillow, Jonathan W.; Ramadge, Peter J.; Norman, Kenneth A.; Hasson, Uri
To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr1ct8x
Abstract: Different neural networks trained on the same dataset often learn similar input-output mappings, yet with very different weights. Is there some correspondence between these neural network solutions? For linear networks, it has been shown that different instances of the same network architecture encode the same representational similarity matrix, and that their neural activity patterns are connected by orthogonal transformations. However, it is unclear whether this holds for non-linear networks. Using a shared response model, we show that different neural networks encode the same input examples as different orthogonal transformations of an underlying shared representation. We test this claim on both standard convolutional neural networks and residual networks trained on CIFAR-10 and CIFAR-100.
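The orthogonality claim is testable directly: because orthogonal transformations preserve inner products, two networks whose activations are orthogonal transforms of a shared representation must encode identical representational similarity matrices. Below is a minimal sketch of such a test, not the authors' code: the activation matrices are simulated rather than extracted from trained networks, and scipy's orthogonal Procrustes fit stands in for the full shared response model used in the paper.

```python
# Minimal sketch: check whether two sets of activation patterns are
# related by an orthogonal transformation (hypothetical data).
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)

# Simulated activations: n_examples x n_units matrices, as if extracted
# from two networks evaluated on the same inputs. net_b is an orthogonal
# rotation of the shared representation, plus a little noise.
n_examples, n_units = 500, 64
shared = rng.standard_normal((n_examples, n_units))
q, _ = np.linalg.qr(rng.standard_normal((n_units, n_units)))  # random orthogonal map
net_a = shared
net_b = shared @ q + 0.01 * rng.standard_normal((n_examples, n_units))

# Find the orthogonal matrix R minimizing ||net_a @ R - net_b||_F.
R, _ = orthogonal_procrustes(net_a, net_b)

# If the two networks share representational geometry, the aligned
# activations should match closely (small relative residual).
residual = np.linalg.norm(net_a @ R - net_b) / np.linalg.norm(net_b)
print(f"relative alignment error: {residual:.4f}")
```

As a sanity check on the geometry argument, note that net_b @ net_b.T equals net_a @ net_a.T (up to the injected noise) whenever the two are related by an orthogonal map, which is exactly the statement that their representational similarity matrices coincide.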
Publication Date: 3-Apr-2020
Citation: Lu, Qihong; Chen, Po-Hsuan; Pillow, Jonathan W.; Ramadge, Peter J.; Norman, Kenneth A.; Hasson, Uri. "Shared Representational Geometry Across Neural Networks." 32nd Conference on Neural Information Processing Systems (NIPS 2018), Montréal, Canada.
Type of Material: Journal Article
Journal/Proceeding Title: 32nd Conference on Neural Information Processing Systems (NIPS 2018), Montréal, Canada
Version: Author's manuscript
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.