Volume-Preserving Feedforward Neural Network
Neural network architecture
The constructor produces the following architecture[1]:
Here `LinearLowerLayer` performs $x \mapsto x + Lx$ and `NonLinearLowerLayer` performs $x \mapsto x + \sigma(Lx + b)$, where $L$ is a lower-triangular matrix with zeros on the diagonal (which is what makes both maps volume-preserving). The activation function $\sigma$ is the fourth input argument to the constructor and defaults to tanh.
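To make the volume-preservation property concrete, here is a small standalone sketch in plain Julia (not the library's implementation): for a strictly lower-triangular $L$, the Jacobian of $x \mapsto x + \sigma(Lx + b)$ is lower-triangular with ones on the diagonal, so its determinant is exactly one.

```julia
using LinearAlgebra

# Standalone sketch (not the library implementation): a nonlinear
# lower layer x ↦ x + tanh.(L*x .+ b) with strictly lower-triangular L.
L = [0.0  0.0  0.0;
     0.5  0.0  0.0;
     0.2 -0.3  0.0]
b = [0.1, -0.2, 0.3]

layer(x) = x + tanh.(L * x .+ b)

# Jacobian: I + Diagonal(tanh'(Lx + b)) * L, with tanh'(z) = sech(z)^2.
# The second term is strictly lower-triangular, so the Jacobian is
# lower-triangular with unit diagonal ⇒ det = 1 (volume is preserved).
jacobian(x) = I + Diagonal(sech.(L * x .+ b) .^ 2) * L

x = [1.0, 2.0, -1.0]
@assert det(jacobian(x)) ≈ 1.0
```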
Note on SympNets
As SympNets are symplectic maps, they also conserve phase-space volume and therefore form a subcategory of volume-preserving feedforward layers.
Library Functions
`GeometricMachineLearning.VolumePreservingFeedForward` — Type

Realizes a volume-preserving neural network as a combination of `VolumePreservingLowerLayer` and `VolumePreservingUpperLayer`.
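The combination itself is also volume-preserving: by the chain rule, the determinant of the composite Jacobian is the product of the individual determinants, each of which is one. The following standalone sketch (plain arrays with strictly lower- and upper-triangular weights; not the library's code) checks this for one lower/upper pair:

```julia
using LinearAlgebra

# Standalone sketch of one lower/upper pair (not the library code):
# both weight matrices are strictly triangular, so each layer's
# Jacobian is triangular with unit diagonal.
L = [0.0  0.0; 0.7 0.0]   # strictly lower-triangular
U = [0.0 -0.4; 0.0 0.0]   # strictly upper-triangular
b = [0.1, 0.2]

lower(x) = x + tanh.(L * x .+ b)
upper(x) = x + tanh.(U * x .+ b)

# Jacobians of the individual layers at a point x
J_lower(x) = I + Diagonal(sech.(L * x .+ b) .^ 2) * L
J_upper(x) = I + Diagonal(sech.(U * x .+ b) .^ 2) * U

x = [0.5, -1.0]
# Chain rule: det(J of upper ∘ lower) = det(J_upper at lower(x)) * det(J_lower at x)
@assert det(J_upper(lower(x))) * det(J_lower(x)) ≈ 1.0
```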
Constructor
The constructor is called with the following arguments:
- `sys_dim::Int`: The system dimension.
- `n_blocks::Int`: The number of blocks in the neural network (containing linear layers and nonlinear layers). Default is `1`.
- `n_linear::Int`: The number of linear `VolumePreservingLowerLayer`s and `VolumePreservingUpperLayer`s in one block. Default is `1`.
- `activation`: The activation function for the nonlinear layers in a block.
- `init_upper::Bool=false` (keyword argument): Specifies if the first layer is lower or upper.
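Putting the arguments together, a constructor call might look like the following sketch (the argument values are illustrative, and it assumes `GeometricMachineLearning` is loaded):

```julia
using GeometricMachineLearning

# Illustrative values only; argument order follows the list above.
arch = VolumePreservingFeedForward(
    3,     # sys_dim: system dimension
    2,     # n_blocks: number of blocks
    1,     # n_linear: linear lower/upper layers per block
    tanh;  # activation for the nonlinear layers
    init_upper = false,  # first layer is of type lower
)
```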
[1]: Based on the input arguments `n_linear` and `n_blocks`. In this example `init_upper` is set to `false`, which means that the first layer is of type lower, followed by a layer of type upper.