Group Equivariant Convolutional Networks

Mar 06, 2020 - 3:00 pm to 4:00 pm

Campus, PAB 232

Alex Madurowicz

Join us Friday, March 6th at 3pm in PAB 232 on campus (and on Zoom) for the next meeting of the Stats and ML Journal Club. This week, Alex Madurowicz will be leading a discussion of Group Equivariant CNNs, an early attempt at incorporating abstract symmetries into neural networks. See you then!

Title: Group Equivariant Convolutional Networks

Abstract: We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries. G-CNNs use G-convolutions, a new type of layer that enjoys a substantially higher degree of weight sharing than regular convolution layers. G-convolutions increase the expressive capacity of the network without increasing the number of parameters. Group convolution layers are easy to use and can be implemented with negligible computational overhead for discrete groups generated by translations, reflections, and rotations. G-CNNs achieve state-of-the-art results on CIFAR10 and rotated MNIST.
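To get a feel for what a G-convolution does, here is a minimal NumPy sketch (not the paper's implementation) of the first "lifting" layer for the p4 group, i.e. translations plus 90-degree rotations: one shared filter is correlated with the input under all four rotations, so rotating the input simply rotates each feature map and cyclically shifts the rotation channel, rather than producing an unrelated response.

```python
import numpy as np

def conv2d(x, w):
    """Plain 'valid' 2D cross-correlation."""
    H, W = x.shape
    k = w.shape[0]
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

def p4_lifting_conv(x, w):
    """Lift an image to the p4 group by correlating with all four
    90-degree rotations of a single shared filter.  Output shape:
    (4, H-k+1, W-k+1), one channel per rotation."""
    return np.stack([conv2d(x, np.rot90(w, r)) for r in range(4)])

# Equivariance check: rotating the input rotates each feature map
# and cyclically permutes the rotation channels.
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 6))
w = rng.standard_normal((3, 3))
y = p4_lifting_conv(x, w)
y_rot = p4_lifting_conv(np.rot90(x), w)
for r in range(4):
    assert np.allclose(np.rot90(y[r]), y_rot[(r + 1) % 4])
```

The equivariance assertion at the end is the point of the construction: the network's response to a rotated input is a predictable transformation of its response to the original, which is the weight-sharing the abstract refers to. (Filter names and shapes here are illustrative, not from the paper.)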