Volume-Preserving Feedforward Neural Network

Neural network architecture

The constructor produces an architecture that alternates between volume-preserving lower and upper layers[1].

Here LinearLowerLayer performs $x \mapsto x + Lx$ and NonLinearLowerLayer performs $x \mapsto x + \sigma(Lx + b)$, where $L$ is a lower-triangular matrix. The activation function $\sigma$ is the fourth input argument to the constructor and is tanh by default.
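To see why these maps preserve volume, note that if $L$ is strictly lower-triangular (zero diagonal, an assumption made here for illustration), the Jacobian of either map is unit lower-triangular and hence has determinant 1. The following sketch checks this numerically with NumPy; the function names are illustrative and not part of the library:

```python
import numpy as np

def linear_lower_layer(x, L):
    # x -> x + L x, with L strictly lower-triangular
    return x + L @ x

def nonlinear_lower_layer(x, L, b, sigma=np.tanh):
    # x -> x + sigma(L x + b); Jacobian is I + diag(sigma'(Lx+b)) L,
    # which is unit lower-triangular when L is strictly lower-triangular
    return x + sigma(L @ x + b)

def jacobian_det(f, x, eps=1e-6):
    # determinant of the Jacobian of f at x via central finite differences
    n = x.size
    J = np.empty((n, n))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        J[:, j] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return np.linalg.det(J)

rng = np.random.default_rng(0)
n = 4
L = np.tril(rng.standard_normal((n, n)), k=-1)  # strictly lower-triangular
b = rng.standard_normal(n)
x = rng.standard_normal(n)

det_lin = jacobian_det(lambda y: linear_lower_layer(y, L), x)
det_nl = jacobian_det(lambda y: nonlinear_lower_layer(y, L, b), x)
# both determinants equal 1 up to finite-difference error
```

The upper layers work the same way with a strictly upper-triangular matrix.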

Note on SympNets

As SympNets are symplectic maps, they also conserve phase space volume and can therefore be seen as a special case of volume-preserving feedforward networks.

Library Functions

GeometricMachineLearning.VolumePreservingFeedForward — Type

Realizes a volume-preserving neural network as a combination of VolumePreservingLowerLayer and VolumePreservingUpperLayer.

Constructor

The constructor is called with the following arguments:

  • sys_dim::Int: The system dimension.
  • n_blocks::Int: The number of blocks in the neural network (containing linear layers and nonlinear layers). Default is 1.
  • n_linear::Int: The number of linear VolumePreservingLowerLayers and VolumePreservingUpperLayers in one block. Default is 1.
  • activation: The activation function for the nonlinear layers in a block. Default is tanh.
  • init_upper::Bool=false (keyword argument): Specifies if the first layer is lower or upper.
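The alternation these arguments produce can be sketched as follows. This is a hypothetical Python illustration of one plausible reading of the constructor (each block: n_linear alternating linear layers followed by one nonlinear layer); the exact composition in GeometricMachineLearning may differ:

```python
def layer_sequence(n_blocks=1, n_linear=1, init_upper=False):
    """Illustrative list of layer types for the alternation described above."""
    layers = []
    lower = not init_upper  # init_upper=False -> the first layer is lower
    for _ in range(n_blocks):
        for _ in range(n_linear):
            layers.append("LinearLower" if lower else "LinearUpper")
            lower = not lower  # alternate lower/upper after every layer
        layers.append("NonLinearLower" if lower else "NonLinearUpper")
        lower = not lower
    return layers

# With the defaults (n_blocks=1, n_linear=1, init_upper=False) this gives
# ["LinearLower", "NonLinearUpper"]
```

This matches the footnote below: with init_upper set to false, a lower layer comes first, followed by an upper layer.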
[1]: Based on the input arguments n_linear and n_blocks. In this example init_upper is set to false, which means that the first layer is of type lower, followed by a layer of type upper.