Lyapunov functions
dynax.ConvexLyapunov
Convex Lyapunov normalization.
This module normalizes a convex function so that it becomes a valid Lyapunov function suitable for showing global stability. It applies the following transformation to a given function \(f:\mathbb{R}^n \rightarrow \mathbb{R}\): $$ F(x) = f(x) - f(x_0) - \frac{\partial f}{\partial x} \bigg\vert_{x_0}\cdot (x-x_0) + \epsilon\left\lVert x-x_0 \right\rVert^2. $$ This ensures that the resulting function \(F\) has a unique minimum at \(x_0\) and is positive definite, thanks to the small regularization term \(\epsilon\left\lVert x-x_0 \right\rVert^2\).
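As a standalone illustration (not the module's actual implementation), a minimal JAX sketch of this normalization might look as follows; `normalize_convex` and the example `f` are hypothetical names used only for illustration:

```python
import jax
import jax.numpy as jnp


def normalize_convex(f, x0, epsilon=1e-6):
    """Return F(x) = f(x) - f(x0) - grad f(x0) . (x - x0) + epsilon * ||x - x0||^2."""
    f_x0 = f(x0)
    grad_x0 = jax.grad(f)(x0)

    def F(x):
        dx = x - x0
        # Subtract the first-order Taylor expansion of f around x0 and add the
        # quadratic regularizer, so F(x0) = 0 and F(x) > 0 for all x != x0.
        return f(x) - f_x0 - grad_x0 @ dx + epsilon * jnp.sum(dx**2)

    return F


# Example: for a convex f, the normalized F has its unique minimum at x0.
f = lambda x: jnp.sum((x + 1.0) ** 2)
x0 = jnp.zeros(3)
F = normalize_convex(f, x0)
print(F(x0))  # 0.0
```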
`__init__(func, state_size, minimum_init=<function zeros>, minimum_learnable=False, epsilon=1e-06, dtype=None, *, key)`
Initialize the Lyapunov normalization.
PARAMETER | DESCRIPTION
---|---
`func` | Convex function \(f:\mathbb{R}^n \rightarrow \mathbb{R}\).
`state_size` | State size \(n\), needed to determine the size of the minimum \(x_0\).
`minimum_init` | Initializer for the minimum location. Can be any function with signature
`minimum_learnable` | If `True`, the minimum location \(x_0\) is learnable. Otherwise, its gradients are stopped, preventing updates during optimization.
`epsilon` | Small value to ensure the Lyapunov function is positive definite.
`dtype` | The dtype to use for the minimum. Defaults to either
`key` | PRNG key for random initialization of the minimum.
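For orientation, a hedged usage sketch based on the signature above; the quadratic candidate `f` is a hypothetical example, and the assumption that the resulting module can be called directly on a state vector is not confirmed by this page:

```python
import jax
import jax.numpy as jnp
import dynax

# Any convex scalar-valued candidate works; here a simple quadratic (illustrative only).
def f(x):
    return jnp.sum(x**2)

V = dynax.ConvexLyapunov(
    f,
    state_size=3,
    minimum_learnable=True,  # let x0 be updated during optimization
    epsilon=1e-6,
    key=jax.random.PRNGKey(0),
)

# Assumed call signature: evaluate the normalized Lyapunov function at a state.
value = V(jnp.ones(3))
```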