Lyapunov functions
dynax.ConvexLyapunov
Convex Lyapunov normalization.
This module normalizes a convex function so that it is a valid Lyapunov function suitable for showing global stability. It applies the following transformation to a given function \(f:\mathbb{R}^n \rightarrow \mathbb{R}\): $$ F(x) = f(x) - f(x^\ast) - \frac{\partial f}{\partial x} \bigg\vert_{x^\ast}\cdot (x-x^\ast) + \epsilon\left\lVert x-x^\ast \right\rVert^2. $$ This ensures the resulting function \(F\) has a unique minimum at \(x^\ast\) and is positive definite thanks to the small regularization term \(\epsilon\).
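The transformation can be written down directly in JAX. The sketch below only illustrates the formula above; it is not the library's internal implementation, and the helper name `normalize_convex` is hypothetical.

```python
import jax
import jax.numpy as jnp


def normalize_convex(f, x_star, epsilon=1e-6):
    """Return F(x) = f(x) - f(x*) - df/dx|_{x*} . (x - x*) + eps * ||x - x*||^2."""
    grad_at_min = jax.grad(f)(x_star)  # gradient of f evaluated at the minimum x*

    def F(x):
        return (
            f(x)
            - f(x_star)
            - jnp.dot(grad_at_min, x - x_star)
            + epsilon * jnp.sum((x - x_star) ** 2)
        )

    return F


# Example: a convex quadratic whose raw minimum is not at x* = 0.
f = lambda x: jnp.sum((x - 1.0) ** 2) + 3.0
F = normalize_convex(f, x_star=jnp.zeros(2))
print(F(jnp.zeros(2)))  # 0.0 -- F has its unique minimum at x*
```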
__init__(func, state_size, minimum_init=<function zeros>, minimum_learnable=False, epsilon=1e-06, dtype=None, *, key)
Initialize the Lyapunov normalization.
| PARAMETER | DESCRIPTION |
|---|---|
| `func` | Convex function \(f:\mathbb{R}^n \rightarrow \mathbb{R}\). |
| `state_size` | State size \(n\) needed to determine the size for the minimum \(x^\ast\). |
| `minimum_init` | Initializer for the minimum location. Can be any function with signature |
| `minimum_learnable` | If `True`, the minimum location \(x^\ast\) is learnable. Otherwise, its gradients are stopped, preventing updates during optimization. |
| `epsilon` | Small value to ensure the Lyapunov function is positive definite. |
| `dtype` | The dtype to use for the minimum. Defaults to either |
| `key` | PRNG key for random initialization of the minimum. |
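A possible usage sketch follows. It assumes the constructed module is called directly on a state vector and returns the scalar \(F(x)\); only the constructor arguments shown above appear in the documented signature, and the candidate function `f` is an arbitrary convex example.

```python
import jax
import jax.numpy as jnp
import dynax


# A simple convex candidate function f: R^n -> R (illustrative choice).
def f(x):
    return jnp.sum(jnp.exp(x) + x**2)


key = jax.random.PRNGKey(0)

# Construct the normalized Lyapunov function with a learnable minimum.
V = dynax.ConvexLyapunov(
    f,
    state_size=2,
    minimum_learnable=True,
    epsilon=1e-6,
    key=key,
)

# Assumed call convention: evaluate the normalized Lyapunov function at a state.
x = jnp.array([0.5, -1.0])
print(V(x))
```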