Our recent work analysing a set of permutation-invariant neural network architectures is probably at the theoretical end of the spectrum of work we do at the A2I lab. Nevertheless, it is equally exciting, as it has concrete implications for real-world robotics, such as working with point clouds from lidar. We encourage reading either, or both, of the following:
Abstract: Recent work on the representation of functions on sets has considered the use of summation in a latent space to enforce permutation invariance. In particular, it has been conjectured that the dimension of this latent space may remain fixed as the cardinality of the sets under consideration increases. However, we demonstrate that the analysis leading to this conjecture requires mappings which are highly discontinuous and argue that this is only of limited practical use. Motivated by this observation, we prove that an implementation of this model via continuous mappings (as provided by e.g. neural networks or Gaussian processes) actually imposes a constraint on the dimensionality of the latent space. Practical universal function representation for set inputs can only be achieved with a latent dimension at least the size of the maximum number of input elements.
The figure below shows the DeepSets architecture from Zaheer et al. that we are analysing:
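To make the sum-decomposition structure concrete, here is a minimal sketch of the model in NumPy. The names `phi`, `rho`, and the particular toy embedding are illustrative choices, not the paper's implementation: each element is mapped into a latent space of some fixed dimension, the embeddings are summed (which is what enforces permutation invariance), and an outer map produces the output. The latent dimension is the quantity the abstract argues must grow with the maximum set size.

```python
import numpy as np

# Hypothetical sketch of a sum-decomposed set function f(X) = rho(sum_x phi(x)).
# phi and rho here are toy continuous maps chosen for illustration only.

def phi(x, latent_dim=4):
    # Toy per-element embedding into R^latent_dim: powers of x.
    return np.array([x ** k for k in range(1, latent_dim + 1)])

def rho(z):
    # Toy outer map: read off the first latent coordinate
    # (which, after summation, equals the sum of the input elements).
    return z[0]

def deep_sets(xs, latent_dim=4):
    # Summing the element embeddings makes the representation
    # invariant to the ordering of the input set.
    z = np.sum([phi(x, latent_dim) for x in xs], axis=0)
    return rho(z)

xs = [0.5, -1.0, 2.0]
print(deep_sets(xs))
print(deep_sets(list(reversed(xs))))  # identical: order does not matter
```

Reordering the input leaves the output unchanged because addition is commutative; the expressiveness question studied in the paper is about how large `latent_dim` must be for such a construction to represent arbitrary continuous set functions.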