module: statespace

This module provides methods to define autonomous linear state space models (ALSSMs) and to set up squared error cost functions based on such models. ALSSMs are input-free linear state space models (LSSMs), i.e., the model output is fully determined by a single state vector, often denoted as the initial state vector x. The output sequence of such a deterministic model serves as the signal model. Cost functions based on ALSSMs are internally computed efficiently using recursive computation rules.

The module implements the concepts published in [Wildhaber2018] PDF, with extensions from [Zalmai2017] and [Wildhaber2019].

ALSSM Classes

Introductory Example

>>> import lmlib as lm
>>>
>>> # defining an ALSSM of a rotation matrix and evaluating its values over time
>>> omega = 3.14159 / 4  # sinusoidal frequency
>>> alssm = lm.AlssmSin(omega, label='MySin')  # setting-up ALSSM model
>>> alssm.dump_tree()  # dumping ALSSM matrices dimensions and properties
└-Alssm : sinusoidal, A: (2, 2), C: (1, 2), label: MySin

>>> print(alssm)  # displaying internal ALSSM transition matrix A and output matrix C
A =
[[ 0.70710725 -0.70710631]
 [ 0.70710631  0.70710725]]
C =
[[1 0]]
>>> x = [1, 0]  # initial state vector
>>> js = range(5)  # (time) indices of ALSSM evaluation
>>> print(alssm.eval(x, js))  # evaluation and printing ALSSM output sequence
[[ 1.00000000e+00]
 [ 7.07107250e-01]
 [ 1.32679490e-06]
 [-7.07105374e-01]
 [-1.00000000e+00]]

Class List

Alssm

Generic (Native) Autonomous Linear State Space Model (ALSSM)

AlssmPoly

ALSSM with discrete-time polynomial output sequence

AlssmPolyJordan

ALSSM with discrete-time polynomial output sequence, in Jordan normal form

AlssmSin

ALSSM with discrete-time (damped) sinusoidal output sequence.

AlssmExp

ALSSM with discrete-time exponentially decaying/increasing output sequence.

AlssmStacked

Creates a joined ALSSM generating a stacked output signal of multiple ALSSMs.

AlssmStackedSO

Stacks multiple ALSSMs, generating the summed output vector of the ALSSMs.

AlssmProd

Joins multiple ALSSMs generating the output product.


Cost Function Classes

Introductory Example

Example of a linear regression estimate obtained by minimizing a squared error cost, using an ALSSM line model with a (quasi-)rectangular window.

>>> import lmlib as lm
>>> from lmlib.utils.generator import *
>>>
>>> K = 100
>>> y = 0.1 * gen_slope(K, K) + gen_wgn(K, sigma=0.1)  # test signal generator (ramp of slope 0.1 with additive white Gaussian noise)
>>>
>>> alssm = lm.AlssmPoly(poly_degree=1)  # creating straight line model (=polynomial of degree 1)
>>> segment = lm.Segment(a=0, b=100, direction=lm.FORWARD, g=1e6)  # defining interval (segment) and window (~rectangular window)
>>> cseg = lm.CostSegment(alssm, segment)  # assigning the windowed segment to the alssm, leading to a cost function
>>> param = lm.SEParam()  # initializing data container for results (estimates)
>>> param.filter(cseg, y)  # applying recursive squared error filter
>>> xs = param.minimize_x()  # minimizing squared error cost to get optimal initial states of ALSSM
>>>
>>> ks = [10, 20]  # time indices to evaluate ALSSM (i.e., regression line) at
>>> print(xs[ks])  # state estimates [offset, slope] at the indices in ks
[[0.00903193 0.00129693]
 [0.04238851 0.00089603]]
>>> Js = param.eval_errors(xs[ks], ks)  # squared errors of the straight-line fits at the indices in ks
>>> print(Js)
[0.81865598 0.71205991]

Function List

map_window

Maps the window amplitudes of one or multiple Segments at indices ks into a common target output vector of length K.

map_trajectories

Maps trajectories at indices ks into a common target output vector of length K.

map_values

Maps values from the multi-dimensional input array y to the indices ks in a multi-dimensional output array.

Class List

Segment

Segment represents a window of finite or infinite interval borders used to select and weight signal samples in a cost function.

CostSegment

Quadratic cost function defined by an ALSSM and a Segment.

CompositeCost

Quadratic cost function defined by a conjunction of one or multiple ALSSMs and Segments

ConstrainMatrix

Builder class to set up matrix H as a linear constraint for the squared error minimization

SEParam

Data container for squared error filter’s output

API (module: statespace)

Constants

FORWARD = 'fw'

Sets the recursion direction in a Segment to forward, use lm.FORWARD

Type

str

BACKWARD = 'bw'

Sets the recursion direction in a Segment to backward, use lm.BACKWARD

Type

str
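A minimal usage sketch (mirroring the Segment examples further below); the constants select the recursion direction of a Segment:

>>> import lmlib as lm
>>> segment_fw = lm.Segment(a=-20, b=-1, direction=lm.FORWARD, g=15)  # left-sided window, forward recursions
>>> segment_bw = lm.Segment(a=0, b=20, direction=lm.BACKWARD, g=15)   # right-sided window, backward recursions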

Classes

class Alssm(A, C, label=None)

Bases: lmlib.statespace.models._ModelBase

Generic (Native) Autonomous Linear State Space Model (ALSSM)

This class holds the parameters of a discrete-time, autonomous (i.e., input-free), single- or multi-output linear state space model, defined recursively by

\[ \begin{align}\begin{aligned}x[k+1] &= Ax[k]\\s_k(x) = y[k] &= Cx[k],\end{aligned}\end{align} \]

where \(A \in \mathbb{R}^{N\times N}, C \in \mathbb{R}^{L \times N}\) are the fixed model parameters (matrices), \(k\) the time index, \(y[k] \in \mathbb{R}^{L \times 1}\) the output vector, and \(x[k] \in \mathbb{R}^{N}\) the state vector.

For more details, see also [Wildhaber2019] [Eq. 4.1].
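As a plain-NumPy cross-check (not part of the library API; matrices and values chosen for illustration, matching the 'line' example below), the output \(s_j(x) = CA^j x\) can be reproduced directly from the model matrices:

>>> import numpy as np
>>> A = np.array([[1, 1], [0, 1]])  # 'line' model
>>> C = np.array([[1, 0]])
>>> x = np.array([0.1, 3])          # offset 0.1, slope 3
>>> print(C @ np.linalg.matrix_power(A, 4) @ x)  # s_4(x) = C A^4 x = offset + 4*slope
[12.1]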

Parameters
  • A (array_like, shape=(N, N)) – State Matrix

  • C (array_like, shape=(L, N)) – Output Matrix

  • label (str, optional) – Model label (default: label=None)

N : ALSSM system order, corresponding to the number of state variables
L : output order / number of signal channels

Examples

>>> A = [[1, 1], [0, 1]]
>>> C = [[1, 0]]
>>> alssm = lm.Alssm(A, C, label='line')
>>> print(alssm)
A =
[[1 1]
 [0 1]]
C =
[[1 0]]
>>> alssm
Alssm : native, A: (2, 2), C: (1, 2), label: line

Attributes

Alssm.A

State matrix \(A \in \mathbb{R}^{N \times N}\)

Alssm.C

Output matrix \(C \in \mathbb{R}^{L \times N}\)

Alssm.N

Model order \(N\)

Alssm.L

Output order \(L\)

Alssm.label

Label of the model

Alssm.output_count

Number of outputs

Alssm.state_variable_count

Number of state variables

Alssm.alssm_type_str

String representing the current class type

Alssm.state_variable_labels

Dictionary containing state variable labels and index

Methods

eval(xs[, js])

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js.

dump_tree([level])

Returns the structure of the model as a string.

set_state_variable_label(name, indices)

Adds an additional label for one or multiple state variables in the state vector.

eval(xs, js=None)

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js. js is either an array of indices or a range() object.

For js=None (default), eval(...) returns the ALSSM output

\[y_0 = s_0(x) = CA^0x = Cx\]

for each state vector \(x\) from the array xs.

For js != None, eval(...) returns the ALSSM output

\[y_j = s_j(x) = CA^jx\]

for each \(j\) in the list js and each state vector \(x\) from the array xs.

Parameters
  • xs (array_like of shape=(XS,N[,S])) – List of length XS with state vectors \(x\)

  • js (array_like of shape=(J,), range(), None, optional) – ALSSM evaluation indices

Returns

ys (ndarray of shape=(XS,[J,]L[,S])) – ALSSM outputs

Note

xs always expects a list of state vectors, even when evaluating a single state, e.g., xs = [ [.1, 1.0] ]. Likewise, ys is always returned as a multi-channel signal, even in a single-channel application, e.g., ys = [ [.1], [2], [.3] ]. Also check the examples below.

N : ALSSM system order, corresponding to the number of state variables
L : output order / number of signal channels
S : number of signal sets
j : ALSSM evaluation index
J : number of ALSSM evaluation indices

Examples

>>> A = [[1, 1], [0, 1]]
>>> C = [[1, 0]]
>>> alssm = lm.Alssm(A, C, label='line')
>>>
>>> xs = [[0.1, 3], [0, 1], [-0.8, 0.2], [1, -3]]
>>> s = alssm.eval(xs)
>>> print(s)
[[ 0.1]
 [ 0. ]
 [-0.8]
 [ 1. ]]
>>> y = alssm.eval(xs, js=[0, 1])
>>> print(y)
[[[ 0.1]
  [ 3.1]]
 [[ 0. ]
  [ 1. ]]
 [[-0.8]
  [-0.6]]
 [[ 1. ]
  [-2. ]]]
>>> y = alssm.eval([[0.1, 3]], js=range(-2,2))
>>> print(y)
[[[-5.9]
  [-2.9]
  [ 0.1]
  [ 3.1]]]
dump_tree(level=0)

Returns the structure of the model as a string.

Parameters

level (int) – String-indent level (used for recursive calls in nested LSSMs) (default: level = 0)

Returns

out (str) – String representing internal model structure.

Examples

>>> alssm = lm.AlssmPoly(poly_degree=2, label='slope_with_offset')
>>> print(alssm.dump_tree())
└-Alssm : polynomial, A: (3, 3), C: (1, 3), label: slope_with_offset
set_state_variable_label(name, indices)

Adds an additional label for one or multiple state variables in the state vector. Such labels can be used to quickly reference single states in the state vector by name.

Parameters
  • name (str) – Label name

  • indices (tuple) – Tuple with index/indices

Examples

>>> alssm = lm.AlssmPoly(poly_degree=1, label='slope_with_offset')
>>> alssm.set_state_variable_label('slope', (1,))
>>> alssm.state_variable_labels
{'x': range(0, 2), 'x0': (0,), 'x1': (1,), 'slope': (1,)}
>>> alssm.state_variable_labels['slope']
(1,)
class AlssmPoly(poly_degree, C=None, label=None)

Bases: lmlib.statespace.models.Alssm

ALSSM with discrete-time polynomial output sequence

Representation of a Q-th degree polynomial in i of the form

\[P_Q(i) = x_0 i^0 + x_1 i^1 + ... + x_Q i^Q\]

as an ALSSM, with transition matrix

\[\begin{split}A = \begin{bmatrix} 1 & 1 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 1 \end{bmatrix}\end{split}\]

and state vector

\[\begin{split}x = \begin{bmatrix} x_0 & x_1 & \dots & x_Q \\ \end{bmatrix}^T\end{split}\]

For more details see [Wildhaber2019] PDF
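To illustrate the polynomial correspondence, a short sketch (assuming the default output matrix C = [[1, 0, ...]]; state values chosen for illustration): evaluating the model at index j returns \(P_Q(j)\).

>>> alssm = lm.AlssmPoly(poly_degree=2)
>>> x = [1, 2, 3]                      # P(i) = 1 + 2*i + 3*i**2
>>> ys = alssm.eval([x], js=range(3))  # expected values: P(0)=1, P(1)=6, P(2)=17 (returned as nested single-channel arrays)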

Parameters
  • poly_degree (int) – Polynomial degree Q, corresponding to the highest exponent of the polynomial. The resulting ALSSM system order is N = Q+1.

  • C (array_like, shape=(L, N), optional) – Output Matrix. If no output matrix is given, C is initialized automatically to [[1, 0, …]] such that the shape is (1, N). (default: C=None)

  • label (str, optional) – Model label (default: label=None)

N : ALSSM system order, corresponding to the number of state variables
L : output order / number of signal channels

Examples

Setting up a third-degree polynomial ALSSM (system order N = 4).

>>> poly_degree = 3
>>> alssm = lm.AlssmPoly(poly_degree, label='poly')
>>> print(alssm)
A =
[[1 1 1 1]
 [0 1 2 3]
 [0 0 1 3]
 [0 0 0 1]]
C =
[[1 0 0 0]]
>>> C = [[1, 0, 0], [0, 1, 0]]
>>> alssm = lm.AlssmPoly(poly_degree=2, C=C, label='poly')
>>> print(alssm)
A =
[[1 1 1]
 [0 1 2]
 [0 0 1]]
C =
[[1 0 0]
 [0 1 0]]

Attributes

AlssmPoly.poly_degree

Polynomial degree \(Q\) (highest exponent; equals the system order N minus 1)

AlssmPoly.alssm_type_str

String representing the current class type

AlssmPoly.A

State matrix \(A \in \mathbb{R}^{N \times N}\)

AlssmPoly.C

Output matrix \(C \in \mathbb{R}^{L \times N}\)

AlssmPoly.N

Model order \(N\)

AlssmPoly.L

Output order \(L\)

AlssmPoly.label

Label of the model

AlssmPoly.output_count

Number of outputs

AlssmPoly.state_variable_count

Number of state variables

AlssmPoly.state_variable_labels

Dictionary containing state variable labels and index

Methods

eval(xs[, js])

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js.

dump_tree([level])

Returns the structure of the model as a string.

set_state_variable_label(name, indices)

Adds an additional label for one or multiple state variables in the state vector.

class AlssmPolyJordan(poly_degree, C=None, label=None)

Bases: lmlib.statespace.models.Alssm

ALSSM with discrete-time polynomial output sequence, in Jordan normal form

Discrete-time polynomial ALSSM with transition matrix in Jordan normal form, see [Zalmai2017] PDF.

\[\begin{split}A = \begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{bmatrix}\end{split}\]
Parameters
  • poly_degree (int) – Polynomial degree, corresponding to the highest exponent of the polynomial. The resulting system order is N = poly_degree + 1.

  • C (array_like, shape=(L, N), optional) – Output Matrix. If C is not set, C is initialized to [[…, 0, 1]] of shape (1, N). (default: C=None)

  • label (str, optional) – Model label (default: label=None)

N : ALSSM system order, corresponding to the number of state variables
L : output order / number of signal channels

Examples

Setting up a third-degree polynomial ALSSM.

>>> poly_degree = 3
>>> alssm = lm.AlssmPolyJordan(poly_degree, label='poly')
>>> print(alssm)
A =
[[1. 1. 0. 0.]
 [0. 1. 1. 0.]
 [0. 0. 1. 1.]
 [0. 0. 0. 1.]]
C =
[[1 0 0 0]]

Attributes

AlssmPolyJordan.poly_degree

Polynomial degree \(Q\) (highest exponent; equals the system order N minus 1)

AlssmPolyJordan.alssm_type_str

String representing the current class type

AlssmPolyJordan.A

State matrix \(A \in \mathbb{R}^{N \times N}\)

AlssmPolyJordan.C

Output matrix \(C \in \mathbb{R}^{L \times N}\)

AlssmPolyJordan.N

Model order \(N\)

AlssmPolyJordan.L

Output order \(L\)

AlssmPolyJordan.label

Label of the model

AlssmPolyJordan.output_count

Number of outputs

AlssmPolyJordan.state_variable_count

Number of state variables

AlssmPolyJordan.state_variable_labels

Dictionary containing state variable labels and index

Methods

eval(xs[, js])

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js.

dump_tree([level])

Returns the structure of the model as a string.

set_state_variable_label(name, indices)

Adds an additional label for one or multiple state variables in the state vector.

class AlssmSin(omega, rho=1.0, C=None, label=None)

Bases: lmlib.statespace.models.Alssm

ALSSM with discrete-time (damped) sinusoidal output sequence.

The class AlssmSin is defined by a decay factor rho and discrete-time frequency omega.

\[\begin{split}A = \begin{bmatrix} \rho \cos{\omega} & -\rho \sin{\omega} \\ \rho \sin{\omega} & \rho \cos{\omega} \end{bmatrix}\end{split}\]

For more details see [Wildhaber2019] PDF

Parameters
  • omega (float, int) – Frequency \(\omega = 2\pi f_s\)

  • rho (float, int, optional) – Decay factor (default: rho = 1.0)

  • C (array_like, shape=(L, N), optional) – Output Matrix. If no output matrix is given, C is initialized automatically to [[1, 0]]. (default: C=None)

  • label (str, optional) – Model label (default: label=None)

N : ALSSM system order, corresponding to the number of state variables
L : output order / number of signal channels

Examples

To convert a time period given in samples into the corresponding normalized frequency, see k_period_to_omega(), e.g.,

>>> from lmlib.utils.generator import k_period_to_omega
>>> k_period, rho = 100, 0.99  # example values
>>> alssm = lm.AlssmSin(k_period_to_omega(k_period), rho)

Parametrization of a sinusoidal ALSSM:

>>> omega = 0.1
>>> rho = 0.9
>>> alssm = lm.AlssmSin(omega, rho)
>>> print(alssm)
A =
[[ 0.89550375 -0.08985007]
 [ 0.08985007  0.89550375]]
C =
[[1 0]]

Attributes

AlssmSin.omega

Frequency factor \(\omega = 2\pi f_s\)

AlssmSin.rho

Decay factor \(\rho\)

AlssmSin.A

State matrix \(A \in \mathbb{R}^{N \times N}\)

AlssmSin.C

Output matrix \(C \in \mathbb{R}^{L \times N}\)

AlssmSin.N

Model order \(N\)

AlssmSin.L

Output order \(L\)

AlssmSin.label

Label of the model

AlssmSin.output_count

Number of outputs

AlssmSin.state_variable_count

Number of state variables

AlssmSin.state_variable_labels

Dictionary containing state variable labels and index

Methods

eval(xs[, js])

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js.

dump_tree([level])

Returns the structure of the model as a string.

set_state_variable_label(name, indices)

Adds an additional label for one or multiple state variables in the state vector.

class AlssmExp(gamma, C=None, label=None)

Bases: lmlib.statespace.models.Alssm

ALSSM with discrete-time exponentially decaying/increasing output sequence.

Discrete-time linear state space model generating output sequences of exponentially decaying shape with decay factor \(\gamma\).

For more details see [Wildhaber2019] PDF

Parameters
  • gamma (float, int) – Decay factor per sample step (> 1.0: left-sided decaying; < 1.0: right-sided decaying)

  • C (array_like, shape=(L, N), optional) – Output Matrix. If no output matrix is given, C is initialized automatically to [[1]]. (default: C=None)

  • label (str, optional) – Model label (default: label=None)

N : ALSSM system order, corresponding to the number of state variables
L : output order / number of signal channels

Examples

Parametrizing an exponentially decaying ALSSM:

>>> gamma = 0.8
>>> alssm = lm.AlssmExp(gamma)
>>> print(alssm)
A =
[[0.8]]
C =
[[1.]]

Attributes

AlssmExp.gamma

Decay factor per sample \(\gamma\)

AlssmExp.A

State matrix \(A \in \mathbb{R}^{N \times N}\)

AlssmExp.C

Output matrix \(C \in \mathbb{R}^{L \times N}\)

AlssmExp.N

Model order \(N\)

AlssmExp.L

Output order \(L\)

AlssmExp.label

Label of the model

AlssmExp.output_count

Number of outputs

AlssmExp.state_variable_count

Number of state variables

AlssmExp.state_variable_labels

Dictionary containing state variable labels and index

Methods

eval(xs[, js])

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js.

dump_tree([level])

Returns the structure of the model as a string.

set_state_variable_label(name, indices)

Adds an additional label for one or multiple state variables in the state vector.

class AlssmStacked(alssms, deltas=None, label=None)

Bases: lmlib.statespace.models._ModelContainer, lmlib.statespace.models._ModelBase

Creates a joined ALSSM generating a stacked output signal of multiple ALSSMs.

For \(M\) ALSSMs as in Alssm, we get the stacked model’s output \(\tilde{s}_k(\tilde{x}) = \tilde{y}_k\)

\[\begin{split}\tilde{s}_k(\tilde{x}) = \begin{bmatrix} s_k^{(1)}(x_1) \\ \vdots \\ s_k^{(M)}(x_M) \\ \end{bmatrix} = \begin{bmatrix} y_1[k] \\ \vdots \\ y_M[k] \\ \end{bmatrix} = \tilde{y}[k] = \tilde{c} \tilde{A}^k \tilde{x} \ \ ,\end{split}\]

where \(y_m[k]\) denotes the output of the m-th joined ALSSM. \(y_m[k]\) is either a scalar (for single-channel ALSSMs) or a column vector (for multi-channel ALSSMs). Accordingly, the initial state vector is

\[\begin{split}\tilde{x} = \begin{bmatrix} x_1 \\ \vdots \\ x_M \\ \end{bmatrix}\end{split}\]

where \(x_m\) is the initial state vector of the m-th joined ALSSM.

Therefore, the internal model matrices of the joined model are set to the block diagonals

\[ \begin{align}\begin{aligned}\begin{split}\tilde{A} = \left[\begin{array}{c|c|c} A_1 & 0 & 0 \\ \hline 0 & \ddots & 0 \\ \hline 0 & 0 & A_{M} \end{array}\right]\end{split}\\\begin{split}\tilde{C} = \left[\begin{array}{c|c|c} \lambda_1 C_1 & 0 & 0 \\ \hline 0 & \ddots & 0 \\ \hline 0 & 0 & \lambda_M C_{M} \end{array}\right]\end{split}\end{aligned}\end{align} \]

where \(A_m\) and \(C_m\) are the transition matrices and the output vectors of the joined models, respectively, and \(\lambda_1 ... \lambda_M \in \mathcal{R}\) are additional factors to weight each output individually.

For more details see [Wildhaber2019] PDF Sec: Linear Combination of M Systems

Parameters
  • alssms (tuple of shape=(M) of Alssm) – Set of M autonomous linear state space models

  • deltas (list of shape=(M) of floats, optional) – List of M scalar factors for each output matrix of the ALSSMs in alssms. (default: deltas = None, i.e., all scalars are set to 1)

  • label (str, optional) – Model and Container label (default: label=None)

M : number of ALSSMs

Examples

>>> alssm_poly = lm.AlssmPoly(4, label="high order polynomial")
>>> A = [[1, 1], [0, 1]]
>>> C = [[1, 0]]
>>> alssm_line = lm.Alssm(A, C, label="line")
>>> stacked_alssm = lm.AlssmStacked((alssm_poly, alssm_line), label='stacked model')
>>> print(stacked_alssm)
A =
[[1. 1. 1. 1. 1. 0. 0.]
 [0. 1. 2. 3. 4. 0. 0.]
 [0. 0. 1. 3. 6. 0. 0.]
 [0. 0. 0. 1. 4. 0. 0.]
 [0. 0. 0. 0. 1. 0. 0.]
 [0. 0. 0. 0. 0. 1. 1.]
 [0. 0. 0. 0. 0. 0. 1.]]
C =
[[1. 0. 0. 0. 0. 0. 0.]
 [0. 0. 0. 0. 0. 1. 0.]]

Attributes

AlssmStacked.alssms

Set of ALSSM

AlssmStacked.deltas

Output scaling factors for each ALSSM in alssms

AlssmStacked.G

Output scaling factors (Abbreviation)

AlssmStacked.A

State matrix \(A \in \mathbb{R}^{N \times N}\)

AlssmStacked.C

Output matrix \(C \in \mathbb{R}^{L \times N}\)

AlssmStacked.N

Model order \(N\)

AlssmStacked.L

Output order \(L\)

AlssmStacked.label

Label of the model

AlssmStacked.output_count

Number of outputs

AlssmStacked.state_variable_count

Number of state variables

Methods

eval(xs[, js])

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js.

dump_tree([level])

Returns the internal structure of the ALSSM model as a string.

dump_tree(level=0)

Returns the internal structure of the ALSSM model as a string.

Parameters

level (int) – String-indent level (for internal use only, used for recursive calls in nested LSSMs. Default value: 0)

Returns

out (str) – String representing internal model structure.

Examples

>>> alssm_poly = lm.AlssmPoly(4, label="high order polynomial")
>>> A = [[1, 1], [0, 1]]
>>> C = [[1, 0]]
>>> alssm_line = lm.Alssm(A, C, label="line")
>>> stacked_alssm = lm.AlssmStacked((alssm_poly, alssm_line), label='stacked model')
>>> print(stacked_alssm.dump_tree())
└-Alssm : stacked, A: (7, 7), C: (2, 7), label: stacked model
  └-Alssm : polynomial, A: (5, 5), C: (1, 5), label: high order polynomial
  └-Alssm : native, A: (2, 2), C: (1, 2), label: line
class AlssmStackedSO(alssms, deltas=None, label=None)

Bases: lmlib.statespace.models._ModelContainer, lmlib.statespace.models._ModelBase

Stacks multiple ALSSMs, generating the summed output vector of the ALSSMs.

Creating a joined ALSSM generating the summed output signal of \(M\) ALSSMs,

\[\tilde{s}_k(\tilde{x}) = s_k^{(1)}(x_1) + ... + s_k^{(M)}(x_M) = y_1[k] + ... + y_M[k] = \tilde{y}_k = \tilde{c} \tilde{A}^k \tilde{x}\]

Therefore, the internal model matrices of the generated model are

\[ \begin{align}\begin{aligned}\begin{split}\tilde{A} = \left[\begin{array}{c|c|c} A_1 & 0 & 0 \\ \hline 0 & \ddots & 0 \\ \hline 0 & 0 & A_{M} \end{array}\right]\end{split}\\\tilde{C} = \left[\begin{array}{c|c|c} \lambda_1 C_1 & ... & \lambda_M C_M \end{array}\right]\end{aligned}\end{align} \]
Parameters
  • alssms (tuple of shape=(M) of Alssm) – Set of M autonomous linear state space models. All ALSSMs need to have the same number of output channels, see Alssm.output_count()

  • deltas (list of shape=(M) of floats, optional) – List of M scalar factors for each output matrix of the ALSSM in alssms. (default: deltas = None, i.e., all scalars are set to 1)

  • label (str, optional) – Model and Container label (default: label=None)

M : number of ALSSMs

Examples

>>> alssm_poly = lm.AlssmPoly(poly_degree=3)
>>> A = [[1, 1], [0, 1]]
>>> C = [[1, 0]]
>>> alssm_line = lm.Alssm(A, C)
>>> stacked_alssm = lm.AlssmStackedSO((alssm_poly, alssm_line))
>>> print(stacked_alssm)
A =
[[1. 1. 1. 1. 0. 0.]
 [0. 1. 2. 3. 0. 0.]
 [0. 0. 1. 3. 0. 0.]
 [0. 0. 0. 1. 0. 0.]
 [0. 0. 0. 0. 1. 1.]
 [0. 0. 0. 0. 0. 1.]]
C =
[[1. 0. 0. 0. 1. 0.]]
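As a consistency sketch building on the example above (state values chosen for illustration; assumes the stacked state concatenates the sub-model states in the order of the alssms tuple), the summed-output model should reproduce the sum of the individual outputs:

>>> import numpy as np
>>> x_poly = [1.0, 0.2, 0.0, 0.0]
>>> x_line = [0.5, -0.1]
>>> x_stacked = np.concatenate((x_poly, x_line))  # stacked initial state [x_1; x_2]
>>> y_sum = stacked_alssm.eval([x_stacked], js=range(4))
>>> y_ref = np.add(alssm_poly.eval([x_poly], js=range(4)), alssm_line.eval([x_line], js=range(4)))
>>> # y_sum and y_ref should agree up to numerical precision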

Attributes

AlssmStackedSO.alssms

Set of ALSSM

AlssmStackedSO.deltas

Output scaling factors for each ALSSM in alssms

AlssmStackedSO.G

Output scaling factors (Abbreviation)

AlssmStackedSO.A

State matrix \(A \in \mathbb{R}^{N \times N}\)

AlssmStackedSO.C

Output matrix \(C \in \mathbb{R}^{L \times N}\)

AlssmStackedSO.N

Model order \(N\)

AlssmStackedSO.L

Output order \(L\)

AlssmStackedSO.label

Label of the model

AlssmStackedSO.output_count

Number of outputs

AlssmStackedSO.state_variable_count

Number of state variables

Methods

eval(xs[, js])

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js.

dump_tree([level])

Returns the internal structure of the ALSSM model as a string.

class AlssmProd(alssms, deltas=None, label=None)

Bases: lmlib.statespace.models._ModelContainer, lmlib.statespace.models._ModelBase

Joins multiple ALSSMs generating the output product.

Creates a joined ALSSM generating the product of the output signals of \(M\) ALSSMs, shown here for the example of \(M=2\):

\[\begin{split}\tilde{s}_k (\tilde{x}) = s_k^{(1)}(x_1) \cdot s_k^{(2)}(x_2) &= (C_1 A_1^k x_1)(C_2 A_2^k x_2)\\ &= (C_1 A_1^k x_1) \otimes (C_2 A_2^k x_2)\\ &= (C_1 \otimes C_2) (A_1^k \otimes A_2^k) (x_1 \otimes x_2) \ ,\end{split}\]

where \(s_k^{(1)}(x_1) = C_1 A_1^k x_1\) is the first and \(s_k^{(2)}(x_2) = C_2 A_2^k x_2\) the second ALSSM.

For more details, see also [Zalmai2017] [Proposition 2], [Wildhaber2019] [Eq. 4.21].

Parameters
  • alssms (tuple of shape=(M) of Alssm) – Set of M autonomous linear state space models. All ALSSMs need to have the same number of output channels, see Alssm.output_count()

  • deltas (list of shape=(M) of floats, optional) – List of M scalar factors for each output matrix of the ALSSM in alssms. (default: deltas = None, i.e., all scalars are set to 1)

  • label (str, optional) – Model and Container label (default: label=None)

M : number of ALSSMs

Examples

Multiply two ALSSMs

>>> alssm_p = lm.AlssmPoly(poly_degree=2, label='poly')
>>> alssm_s = lm.AlssmSin(omega=0.5, rho=0.2, label='sin')
>>> alssm = lm.AlssmProd((alssm_s, alssm_p), label="multi")
>>> print(alssm)
A =
[[ 0.17551651  0.17551651  0.17551651 -0.09588511 -0.09588511 -0.09588511]
 [ 0.          0.17551651  0.35103302 -0.         -0.09588511 -0.19177022]
 [ 0.          0.          0.17551651 -0.         -0.         -0.09588511]
 [ 0.09588511  0.09588511  0.09588511  0.17551651  0.17551651  0.17551651]
 [ 0.          0.09588511  0.19177022  0.          0.17551651  0.35103302]
 [ 0.          0.          0.09588511  0.          0.          0.17551651]]
C =
[[1. 0. 0. 0. 0. 0.]]
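As a consistency sketch building on the example above (state values chosen for illustration; assumes the product state is the Kronecker product of the factor states in the order of the alssms tuple), the product model should reproduce the element-wise product of the factor outputs:

>>> import numpy as np
>>> x_s = [1.0, 0.0]            # state of the sinusoidal factor
>>> x_p = [1.0, 0.5, 0.1]       # state of the polynomial factor
>>> x_prod = np.kron(x_s, x_p)  # x_1 ⊗ x_2
>>> y_prod = alssm.eval([x_prod], js=range(5))
>>> y_ref = np.multiply(alssm_s.eval([x_s], js=range(5)), alssm_p.eval([x_p], js=range(5)))
>>> # y_prod and y_ref should agree up to numerical precision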

Attributes

AlssmProd.alssms

Set of ALSSM

AlssmProd.deltas

Output scaling factors for each ALSSM in alssms

AlssmProd.G

Output scaling factors (Abbreviation)

AlssmProd.A

State matrix \(A \in \mathbb{R}^{N \times N}\)

AlssmProd.C

Output matrix \(C \in \mathbb{R}^{L \times N}\)

AlssmProd.N

Model order \(N\)

AlssmProd.L

Output order \(L\)

AlssmProd.label

Label of the model

AlssmProd.output_count

Number of outputs

AlssmProd.state_variable_count

Number of state variables

Methods

eval(xs[, js])

Evaluation of the ALSSM for an array of state vectors xs at evaluation index or indices js.

dump_tree([level])

Returns the internal structure of the ALSSM model as a string.

map_window(window, ks, K, merge_ks=False, merge_seg=False, fill_value=0)

Maps the window amplitudes of one or multiple Segments at indices ks into a common target output vector of length K.

The parameter window is typically the direct output of CostSegment.window() or CompositeCost.window(); see the window parameter below.

Parameters
  • window

    • tuple (range, tuple), as output by CostSegment.window().

    • tuple of shape=(P) of tuples (range, tuple), as output by CompositeCost.window().

  • ks (array_like of shape=(KS) of ints) – Target indices in the output vector to which the windows are mapped

  • K (int) – Length of target output vector

  • merge_ks (bool, optional) – If True, all windows from different target indices ks are merged into a single array of length K; if the windows of two target indices overlap, only the window with the larger value remains.

  • merge_seg (bool, optional) – If True, all segments P are merged to a single target vector of length K; if the windows of two segments overlap, only the larger window value remains.

  • fill_value (scalar or None, optional) – Default value of target output vector elements, when no window is assigned.

Returns

output (ndarray of shape=([KS,] [P,] K) of floats) –

  • Dimension KS only exists, if merge_ks=False.

  • Dimension P only exists, if merge_seg=False.

KS : number of (time) indices in the list
P : number of segments
K : number of samples
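A shape-only sketch (values assumed; see the plot examples below for full usage), illustrating how merge_ks collapses the KS dimension:

>>> import lmlib as lm
>>> cseg = lm.CostSegment(lm.AlssmPoly(2), lm.Segment(-40, 0, direction=lm.FORWARD, g=20))
>>> wins = lm.map_window(cseg.window(), ks=[50, 200], K=300)                         # expected shape: (KS, K) = (2, 300)
>>> wins_merged = lm.map_window(cseg.window(), ks=[50, 200], K=300, merge_ks=True)   # expected shape: (K,) = (300,)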

Examples

import lmlib as lm
import matplotlib.pyplot as plt

costsegment = lm.CostSegment(lm.AlssmPoly(2), lm.Segment(-40, 0, direction=lm.FORWARD, g=20))

K = 300
ks = [50, 200]

windows = lm.map_window(costsegment.window(), ks, K)
plt.figure(figsize=(6,1.5), tight_layout = True)
plt.title('map_window(cost_segment, ...)')
plt.plot(range(K), windows[0])
plt.plot(range(K), windows[1])
plt.show()

windows = lm.map_window(costsegment.window(), ks, K, merge_ks=True)
plt.figure(figsize=(6,1.5), tight_layout = True)
plt.title('map_window(cost_segment, ..., merge_k=True)')
plt.plot(range(K), windows)
plt.show()

seg_l = lm.Segment(-40, 0, direction=lm.FORWARD, g=100)
seg_r = lm.Segment(0, 40, direction=lm.BACKWARD, g=100)
compositecost = lm.CompositeCost((lm.AlssmPoly(2),), (seg_l, seg_r), F=[[1, 1]])


windows = lm.map_window(compositecost.window(segment_selection=[1, 1]), ks, K)
plt.figure(figsize=(6,1.5), tight_layout = True)
plt.title('map_window(composite_cost, ...)')
plt.plot(range(K), windows[0][0])
plt.plot(range(K), windows[0][1])
plt.plot(range(K), windows[1][0])
plt.plot(range(K), windows[1][1])
plt.show()

windows = lm.map_window(compositecost.window(segment_selection=[1, 1]), ks, K, merge_ks=True)
plt.figure(figsize=(6,1.5), tight_layout = True)
plt.title('map_window(composite_cost, ..., merge_k=True))')
plt.plot(range(K), windows[0])
plt.plot(range(K), windows[1])
plt.show()

windows = lm.map_window(compositecost.window(segment_selection=[1, 1]), ks, K, merge_seg=True)
plt.figure(figsize=(6,1.5), tight_layout = True)
plt.title('map_window(composite_cost, ..., merge_seg=True)')
plt.plot(range(K), windows[0])
plt.plot(range(K), windows[1])
plt.show()

map_trajectories(trajectory, ks, K, merge_ks=False, merge_seg=False, fill_value=nan)

Maps trajectories at indices ks into a common target output vector of length K.

The parameter trajectory is commonly directly the output of one of the following methods:

  • CostSegment.trajectory()

  • CompositeCost.trajectory()

Parameters
  • trajectory

    • list of shape=(XS) of tuples (range, tuple),

      such tuples are the output of CostSegment.trajectory().

    • list of shape=(XS) of tuples of shape=(P) of tuple (range, tuple),

      such tuples are the output of CompositeCost.trajectory().

  • ks (array_like of shape=(XS) of ints) – Target indices in the output vector to which the trajectories are mapped

  • K (int) – Length of target vector

  • merge_ks (bool, optional) – If True, all trajectories from different target indices ks are merged into a single array of length K; if the trajectories of two target indices overlap, only the trajectory value with the larger CostSegment window value remains.

  • merge_seg (bool, optional) – If True, all segments P are merged to a single target vector of length K; if the trajectories of two segments overlap, only the trajectory value with the larger CostSegment window value remains.

  • fill_value (scalar, optional) – Default value of target output vector elements, when no trajectory is assigned.

Returns

output (ndarray of shape=([XS,] [P,] K, L [,S]) of floats) –

  • Dimension XS only exists, if merge_ks=False.

  • Dimension P only exists, if merge_seg=False

  • Dimension S only exists, if the parameter xs of CostSegment.trajectory() or CompositeCost.trajectory() also provides dimension S (i.e., provides multiple signal set processing).

XS : number of state vectors in a list
P : number of segments
K : number of samples
L : output order / number of signal channels
S : number of signal sets

Examples

import lmlib as lm
import matplotlib.pyplot as plt
from lmlib.utils import gen_wgn, gen_sinusoidal

K = 300
y = gen_wgn(K, sigma=0.1) + gen_sinusoidal(K, k_period=80)
costsegment = lm.CostSegment(lm.AlssmPoly(2), lm.Segment(-40, 0, direction=lm.FORWARD, g=20))
separam = lm.SEParam(costsegment)
xs = separam.filter_minimize_x(y)

ks = [50, 200]

trajs = lm.map_trajectories(costsegment.trajectories(xs[ks]), ks, K)
plt.figure(figsize=(6, 1.5), tight_layout=True)
plt.title('map_trajectories(cost_segment, ...)')
plt.plot(range(K), trajs[0])
plt.plot(range(K), trajs[1])
plt.show()

trajs = lm.map_trajectories(costsegment.trajectories(xs[ks]), ks, K, merge_ks=True)
plt.figure(figsize=(6, 1.5), tight_layout=True)
plt.title('map_trajs(cost_segment, ..., merge_k=True)')
plt.plot(range(K), trajs)
plt.show()

seg_l = lm.Segment(-40, 0, direction=lm.FORWARD, g=100)
seg_r = lm.Segment(0, 40, direction=lm.BACKWARD, g=100)
compositecost = lm.CompositeCost((lm.AlssmPoly(2),), (seg_l, seg_r), F=[[1, 1]])

trajs = lm.map_trajectories(compositecost.trajectories(xs[ks], F=[[1, 1]]), ks, K)
plt.figure(figsize=(6, 1.5), tight_layout=True)
plt.title('map_trajectories(composite_cost, ...)')
plt.plot(range(K), trajs[0][0])
plt.plot(range(K), trajs[0][1])
plt.plot(range(K), trajs[1][0])
plt.plot(range(K), trajs[1][1])
plt.show()

trajs = lm.map_trajectories(compositecost.trajectories(xs[ks], F=[[1, 1]]), ks, K, merge_ks=True)
plt.figure(figsize=(6, 1.5), tight_layout=True)
plt.title('map_trajectories(composite_cost, ..., merge_k=True))')
plt.plot(range(K), trajs[0])
plt.plot(range(K), trajs[1])
plt.show()

trajs = lm.map_trajectories(compositecost.trajectories(xs[ks], F=[[1, 1]]), ks, K, merge_seg=True)
plt.figure(figsize=(6, 1.5), tight_layout=True)
plt.title('map_trajectories(composite_cost, ..., merge_seg=True)')
plt.plot(range(K), trajs[0])
plt.plot(range(K), trajs[1])
plt.show()
map_values(y, ks, K, fill_value=nan)

Maps values from the multi-dimensional input array y to the indices ks in a multi-dimensional output array.

The mapping copies any dimension beyond the first of the input array y to the output array, while the first dimension is enlarged from KS to K.

ks = [20, 50, 80]
K = len(y)
y_sparse = lm.map_values(y[ks], ks, K)
Parameters
  • y (array_like of shape=(KS, .. ) of floats) – list with values to map

  • ks (array_like of shape=(KS) of ints) – Target indices in the output vector to which the values are mapped

  • K (int) – Length of target vector

  • fill_value (scalar, optional) – Default value of target output vector elements, when no value is assigned.

Returns

output (ndarray of shape=(K, …) of floats)

The functionality of this method is equivalent to the code sequence

out = np.full((K,) + np.shape(y)[2::], fill_value)
out[ks] = y

KS : number of (time) indices in the list
K : number of samples

Examples
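The original plot-based example for this function is shown under map_trajectories() above; the following is a minimal sketch (array values chosen for illustration), based on the equivalent code sequence given in the Returns description:

>>> import numpy as np
>>> import lmlib as lm
>>> y = np.array([1.0, 2.0, 3.0])  # values at the (time) indices ks
>>> ks = [2, 5, 8]
>>> K = 10
>>> y_sparse = lm.map_values(y, ks, K)  # y_sparse[ks] == y; all other entries hold fill_value (default: nan)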

class Segment(a, b, direction, g, delta=0, label=None)

Bases: object

Segment represents a window of finite or infinite interval borders used to select and weight signal samples in a cost function.

Segments are commonly used in combination with ALSSM signal models to select and weight the samples in cost functions, see CostSegment or CompositeCost. The window of a segment either has an exponentially decaying shape or is defined by the output of its own ALSSM model, denoted as the window ALSSM. The choice of window also determines the direction of the stable recursive cost computation (forward or backward).

In conjunction with an ALSSM, a Segment leads to a cost function of the form

\[J_k(x) = \sum_{i=k+a}^{k+b} \gamma^{i-k-\delta}\big(CA^{i-k}x - y_i\big)^2 \ ,\]

and, when additionally using sample weights \(v_i\), of the form

\[J_k(x) = \sum_{i=k+a}^{k+b} v_i\,\alpha_{k+\delta}(i)\big(CA^{i-k}x - y_i\big)^2 \ ,\]

with the sample weights \(v_i\) and the window weight \(\alpha_{k+\delta}(i)\), which depends on the sample weights; see Equation (14) in [Wildhaber2018]

See also [Wildhaber2018] [Wildhaber2019]

Parameters
  • a (int, np.inf) – Left boundary of the segment’s interval

  • b (int, np.inf) – Right boundary of the segment’s interval

  • g (int, float) – \(g > 0\). Effective number of samples under the window. This is used as a (more readable) surrogate for the window decay of exponential windows, see [Wildhaber2018].
    \(g\) is counted to the left or to the right of \(k+ \delta\), for the forward or backward computation direction, respectively.

  • direction (str) – Computation direction of recursive computations (also selects a left- or right-side decaying window)
    statespace.FORWARD or ‘fw’ use forward computation with forward recursions
    statespace.BACKWARD or ‘bw’ use backward computation with backward recursions

  • delta (int, optional) – Exponential window is normalized to 1 at relative index delta.

  • label (str, None, optional) – Segment label, useful for debugging in more complex systems (default: label = None)

Notes

The segment's interval includes both boundaries a and b in the calculations, i.e., the sum runs over the indices \(i \in \{k+a, \dots, k+b\}\), comprising \(b-a+1\) samples.

Examples

>>> segment = lm.Segment(a=-20, b=-1, direction=lm.FORWARD, g=15)
>>> print(segment)
Segment : a:-20, b:-1, fw, g:15, delta:0, label: None
>>> segment = lm.Segment(a=-0, b=100, direction=lm.BACKWARD, g=15, delta=30, label="right-sided window with shift")
>>> print(segment)
Segment : a:0, b:100, bw, g:15, delta:30, label: right-sided window with shift

Attributes

Segment.a

Left boundary of the segment’s interval \(a\)

Segment.b

Right boundary of the segment’s interval \(b\)

Segment.direction

Sets the segment’s recursion computation direction

Segment.g

Effective number of samples \(g\), setting the window width

Segment.delta

Relative window shift \(\delta\)

Segment.label

Label of the segment

Segment.gamma

Window decay factor \(\gamma\)

Methods

window([thd])

Returns the per-sample window weights

window(thd=None)

Returns the per-sample window weights

The return values are the window weights \(\alpha_{\delta}(i) \quad \forall i \in [a, b]\) for a constant \(\gamma\). The window weight function is defined as

\[w_i = \gamma^{i}\]

For more details see [Wildhaber2018].

Parameters

thd (float, None, optional) – Threshold for infinite Segment boundaries. Crops any window weight below the threshold.

Returns

tuple (range, array)

  • range of length JR: relative index range of the window with respect to the segment's boundaries.

  • array of shape=(JR) of floats: per-index window weight over the reported index range

JR : index range length

Examples

import matplotlib.pyplot as plt
import lmlib as lm
import numpy as np

# Segment with a left-sided finite interval
segment = lm.Segment(a=-20, b=-1, direction=lm.FORWARD, g=20, delta=-10)
rel_range, window_weights = segment.window()

plt.stem(rel_range, window_weights)
plt.show()

# Segment with a right-sided infinite interval
segment = lm.Segment(a=0, b=np.inf, direction=lm.BACKWARD, g=10)
thd = 0.1
rel_range, window_weights = segment.window(thd)

plt.stem(rel_range, window_weights)
plt.axhline(thd, ls='--', c='k')
plt.show()

class CostSegment(alssm, segment, label=None)

Bases: lmlib.statespace.costfunc._CostBase

Quadratic cost function defined by an ALSSM and a Segment.

A CostSegment is an implementation of [Wildhaber2019] PDF (Cost Segment)

==================
Class: CostSegment (= 1 ALSSM + 1 Segment)
==================

  window weight
      ^
    2_|                   ___     exponentially rising window of segment
    1_|              ____/     <- normalized to 1 at index delta; to
    0_|    _________/             weight the cost function

               segment
           +----------------+
  alssm    |                |
           +----------------+

           |--|-------|-----|----> k (time, relative to the segment reference index 0)
           a  0     delta   b

a,b : segment interval borders a and b in N, a < b

A cost segment is the quadratic cost function

\[J_a^b(k,x,\theta) = \sum_{i=k+a}^{k+b} \alpha_{k+\delta}(i)v_i(y_i - cA^{i-k}x)^2\]

over a fixed interval \(\{a, \dots, b\}\) with \(a \in \mathbb{Z} \cup \{ - \infty \}\), \(b \in \mathbb{Z} \cup \{ + \infty\}\), and \(a \le b\), and with initial state vector \(x \in \mathbb{R}^{N \times 1}\). For more details, see Section 4.2.6 and Chapter 9 in [Wildhaber2019].

Parameters
  • alssm (_ModelBase) – ALSSM, defining the signal model

  • segment (Segment) – Segment, defining the window

  • label (str, None, optional) – text label, simplifies debugging in more complex cost terms (default: label=None)

Examples

Setup a cost segment with finite boundaries and a line ALSSM.

>>> alssm = lm.AlssmPoly(poly_degree=1, label="slope with offset")
>>> segment = lm.Segment(a=-30, b=0, direction=lm.FORWARD, g=20, label="finite left")
>>> cost = lm.CostSegment(alssm, segment, label="left line")
>>> print(cost)
CostSegment : label: left line
  └- Alssm : polynomial, A: (2, 2), C: (1, 2), label: slope with offset,
  └- Segment : a:-30, b:0, fw, g:20, delta:0, label: finite left

Attributes

CostSegment.alssm

ALSSM of CostSegment

CostSegment.segment

Segment of CostSegment

CostSegment.label

Label of the Cost

class CompositeCost(alssms, segments, F=None, label=None)

Bases: lmlib.statespace.costfunc._CostBase

Quadratic cost function defined by a conjunction of one or multiple ALSSMs and Segments

A composite cost combines multiple ALSSM models and multiple Segments in the form of a grid, where each row is a model and each column a Segment. The segments define the temporal relations of the models. The mapping matrix F enables or disables each ALSSM/Segment pair at each grid node; multiple active ALSSMs in one column are superimposed.

A CompositeCost is an implementation of [Wildhaber2019] PDF (Composite Cost)

====================
Class: CompositeCost (= M ALSSM's + P Segment's)
====================


M : number of ALSSM models
^
|                Segment[0]   Segment[1]    ...   Segment[P-1]
|              +------------+------------+--//--+--------------+
|  alssm[0]    |  F[0,0]    |            |      | F[0,P-1]     |
|              +------------+------------+--//--+--------------+
|    .         |     .      |     ...    |      |    .         |
|    .         |     .      |     ...    |      |    .         |
|              +------------+------------+--//--+--------------+
|  alssm[M-1]  |  F[M-1,0]  |            |      | F[M-1, P-1]  |
|              +------------+------------+--//--+--------------+

               |------------|-------|----|------|--------------|--> k (time)
                                    0
                        (0: common relative reference
                         index for all segments)

P : number of segments

F[m,p] in R+, scalar on alssm's output. (i.e., 0 for inactive grid node,
1 or any other scalar factor for active grid node).

This figure shows a graphical representation of a composite cost, depicting the internal relationships between Segments, ALSSMs, and the mapping matrix F. F[m,p] is implemented as a scalar factor multiplied on the alssm’s output signal.

For more details, see Chapter 9 in [Wildhaber2019]. The cost function of a composite cost is defined as

\[J(k, x, \Theta) = \sum_{p = 1}^{P}\beta_p J_{a_p}^{b_p}(k, x, \theta_p) \ ,\]

where, \(\Theta = (\theta_1, \theta_2,\dots, \theta_P)\) and the segment scalars \(\beta_p \in \mathbb{R}_+\).

Parameters
  • alssms (tuple, shape=(M,)) – Set of ALSSMs

  • segments (tuple, shape=(P,)) – Set of Segments

  • F (array_like of shape=(M, P)) – Mapping matrix \(F\), maps models to segments

  • label (str, None, optional) – Cost label (default: label = None)

M : number of ALSSMs
P : number of segments

Examples

>>> alssm_spike = lm.AlssmPoly(poly_degree=3, label='spike')
>>> alssm_baseline = lm.AlssmPoly(poly_degree=2, label='baseline')
>>>
>>> segment_left = lm.Segment(a=-50, b=-1, direction=lm.FORWARD, g=20, label="finite left")
>>> segment_middle = lm.Segment(a=0, b=10, direction=lm.FORWARD, g=100, label="finite middle")
>>> segment_right = lm.Segment(a=10, b=50, direction=lm.FORWARD, g=20, delta=10, label="finite right")
>>>
>>> F = [[0, 1, 0], [1, 1, 1]]
>>> cost = lm.CompositeCost((alssm_spike, alssm_baseline), (segment_left, segment_middle, segment_right), F, label='spike_baseline')
>>> print(cost)
CostSegment : label: spike_baseline
  └- ['Alssm : polynomial, A: (4, 4), C: (1, 4), label: spike', 'Alssm : polynomial, A: (3, 3), C: (1, 3), label: baseline'],
  └- [Segment : a: -50, b: -1, direction: fw, g: 20, delta: 0, label: finite left , Segment : a: 0, b: 10, direction: fw, g: 100, delta: 0, label: finite middle , Segment : a: 10, b: 50, direction: fw, g: 20, delta: 10, label: finite right ]

Attributes

CompositeCost.alssms

Set of ALSSM

CompositeCost.segments

Set of Segment

CompositeCost.F

Mapping matrix \(F\), maps models to segments

CompositeCost.alssm_count

Number of ALSSMs

CompositeCost.segment_count

Number of segments

CompositeCost.label

Label of the Cost

Methods

eval_alssm(xs, alssm_selection[, js])

Evaluation of the ALSSM for multiple state vectors xs.

window(segment_selection[, thds])

Returns for each selected segment its window generated by CostSegment.window().

trajectories(xs[, F, thds])

Returns the trajectories of the CompositeCost's ALSSMs (i.e., output sequences for fixed initial states) for multiple initial state vectors xs

ind_x(label)

Returns the indices of a single or multiple state variables within the full state vector x of an ALSSM by its label.

get_stacked_alssm(alssm_selection)

Returns a newly generated stacked ALSSM with summed outputs of the ALSSMs in the CompositeCost, corresponding to the ALSSM selection.

eval_alssm(xs, alssm_selection, js=None)

Evaluation of the ALSSM for multiple state vectors xs.

See: eval()

Parameters
  • xs (array_like of shape=(XS, N [,S]) of floats) – List of state vectors \(x\)

  • alssm_selection (array_like, shape=(M,) of bool) – Each element enables (True, 1) or disables (False, 0) the m-th ALSSM in CompositeCost.alssms.

  • js (list of shape(J) of int, None, optional) –

Returns

s (ndarray of shape=(XS,[J,]L[,S]) of floats) – ALSSM outputs

N : ALSSM system order, corresponding to the number of state variables
L : output order / number of signal channels
M : number of ALSSMs
S : number of signal sets
XS : number of state vectors in a list

Examples

>>> alssm_spike = lm.AlssmPoly(poly_degree=3, label='spike')
>>> alssm_baseline = lm.AlssmPoly(poly_degree=2, label='baseline')
>>>
>>> segment_left = lm.Segment(a=-50, b=-1, direction=lm.FORWARD, g=20, label="finite left")
>>> segment_right = lm.Segment(a=0, b=50, direction=lm.FORWARD, g=20, label="finite right")
>>>
>>> F = [[0, 1], [1, 1]]
>>> cost = lm.CompositeCost((alssm_spike, alssm_baseline), (segment_left, segment_right), F, label='spike_baseline')
>>>
>>> xs = np.arange(3 * (4 + 3)).reshape(3, 4 + 3)
>>> alssm_selection = [1, 1]
>>> y = cost.eval_alssm(xs, alssm_selection=alssm_selection)
>>> print(y)
[[ 4.]
 [18.]
 [32.]]
>>> alssm_selection = [1, 0]
>>> K = 5
>>> ks = [0, 2, 4]
>>> y = cost.eval_alssm(xs, alssm_selection=alssm_selection, js=[-1, 0, 1])
>>> print(y)
[[[-2.]
  [ 0.]
  [ 6.]]
 [[-2.]
  [ 7.]
  [34.]]
 [[-2.]
  [14.]
  [62.]]]
window(segment_selection, thds=None)

Returns for each selected segment its window generated by CostSegment.window().

The segments are selected by segment_selection.

Parameters
  • segment_selection (array_like, shape=(P,) of Boolean) – Enables (True, 1) or disables (False, 0) the evaluation of the p-th Segment in CompositeCost.segments using CostSegment.window()

  • thds (list of shape=(P,) of floats, scalar, optional) – List of per-segment threshold values, or a scalar to use the same threshold for all segments. Evaluation is stopped for window weights below the given threshold. Set a list element to None to disable the threshold for that segment.

Returns

list of shape=(P) of tuples (range, array) – Each element is a tuple with

  • range of length JR: relative index range of the window with respect to the segment's boundaries.

  • array of shape=(JR) of floats: per-index window weight over the reported index range

JR : index range length
P : number of segments

Examples

import matplotlib.pyplot as plt
import lmlib as lm

alssm_spike = lm.AlssmPoly(poly_degree=3)
alssm_baseline = lm.AlssmPoly(poly_degree=2)

segment_left = lm.Segment(a=-20, b=-1, direction=lm.FORWARD, g=10, label="left")
segment_middle = lm.Segment(a=0, b=9, direction=lm.FORWARD, g=5000, label="middle")
segment_right = lm.Segment(a=10, b=float('inf'), direction=lm.BACKWARD, g=5, delta=10, label="right")

F = [[0, 1, 0], [1, 1, 1]]
cost = lm.CompositeCost((alssm_spike, alssm_baseline), (segment_left, segment_middle, segment_right), F)

segment_selection = [1, 1, 1]
wins = cost.window(segment_selection, thds=[None, None, 0.02])

colors = ['b', 'g', 'r']
for win, segment, color in zip(wins, cost.segments, colors):
    markerline, stemlines, baseline = plt.stem(*win, label=segment.label)
    plt.setp(stemlines, color=color)
    plt.setp(markerline, 'markerfacecolor', color)
    plt.setp(markerline, 'markeredgecolor', color)
    plt.setp(baseline, color='k')

plt.legend()
plt.show()

trajectories(xs, F=None, thds=None)

Returns the trajectories of the CompositeCost's ALSSMs (i.e., output sequences for fixed initial states) for multiple initial state vectors xs

Evaluates the CompositeCost’s ALSSM with the state vectors xs over the time indices defined by CompositeCost.segments. The segment’s interval boundaries limit the evaluation interval (samples outside this interval are set to 0). In particular for segments with infinite interval borders, the threshold option thds additionally limits the evaluation boundaries by defining a minimal window height of the CompositeCost.segments.

Parameters
  • xs (array_like of shape=(XS, N [,S]) of floats) – List of initial state vectors \(x\)

  • F (array_like, shape(M, P) of int, shape(M,) of int) – Mapping matrix. If not None, the given matrix overrides the CompositeCost's internal mapping matrix as provided by the class constructor. If F is only of shape (M,), the vector is enlarged to shape (M, P) by repeating it P times. (This is a shortcut to select a single ALSSM over all segments.)

  • thds (list of shape(P) of floats, scalar, None, optional) – Per-segment threshold limiting the evaluation boundaries by setting a minimum window height of the associated segments. A scalar uses the same threshold for all segments.

Returns

trajectories (list of shape=(XS) of tuples of shape=(P) of tuples (range, array)) – Each element in trajectories is a tuple with

  • range of length JR: relative index range of the trajectory with respect to the segment's boundaries

  • array of shape(JR, L, [S]): trajectory values over reported range.

Dimension S is only present in the output, if dimension S is also present in xs (i.e., if multiple signal sets are used)

JR : index range length
XS : number of state vectors in a list
N : ALSSM system order, corresponding to the number of state variables
S : number of signal sets
M : number of ALSSMs
P : number of segments

Examples

import matplotlib.pyplot as plt
import lmlib as lm

alssm = lm.AlssmPoly(poly_degree=2)

segment_left = lm.Segment(a=-16, b=-1, direction=lm.FORWARD, g=5, label='left segment')
segment_middle = lm.Segment(a=0, b=10, direction=lm.BACKWARD, g=10000, label='middle segment')
segment_right = lm.Segment(a=11, b=26, direction=lm.BACKWARD, g=5, delta=11, label='right segment')

F = [[1, 1, 1]]
cost = lm.CompositeCost((alssm,), (segment_left, segment_middle, segment_right), F)

xs = [[1, 0.02, 0.02], [0, 0.3, -0.2]]  # sets initial state vectors x_1 and x_2
trajs = cost.trajectories(xs)

_, axs = plt.subplots(2, 1)
for n in range(len(xs)):
    axs[n].set_title('Trajectories for xs=' + str(xs[n]))
    for p, segment in enumerate(cost.segments):
        axs[n].plot(*trajs[n][p], label=segment.label)
    axs[n].legend()

plt.suptitle("Demo: CompositeCost.trajectory(xs)")
plt.subplots_adjust(hspace=0.3)
plt.show()

ind_x(label)

Returns the indices of a single or multiple state variables within the full state vector x of an ALSSM by its label.

Parameters

label (str) – state variable label; if the ALSSM was stacked to a larger system, the delimiter (‘.’ dot-symbol) is used to address subsystems, e.g., alssm_1.x0

Returns

out (tuple of ints) – Returns the indices of the requested state variables within the state vector.

Examples

>>> import lmlib as lm
>>>
>>> a1 = lm.AlssmPoly(1, label='left_poly')
>>> a1.set_state_variable_label('offset', (0,))  # adds explicit name to state at index 0
>>>
>>> a2 = lm.AlssmPoly(1, label='right_poly')
>>>
>>> s1 = lm.Segment(a=-20, b=-1, direction=lm.FORWARD, g=2)
>>> s2 = lm.Segment(a=0, b=19, direction=lm.BACKWARD, g=3)
>>>
>>> F = np.eye(2)
>>> cc = lm.CompositeCost((a1, a2), (s1, s2), F)
>>> print(cc.ind_x('left_poly.offset'))
(0,)
>>> print(cc.ind_x('right_poly.x1')) # x1 is the default state name
(3,)
get_stacked_alssm(alssm_selection)

Returns a newly generated stacked ALSSM with summed outputs of the ALSSMs in the CompositeCost, corresponding to the ALSSM selection.

Parameters

alssm_selection (array_like of shape=(M,) of bool) – Enables (True, 1) or disables (False, 0) the evaluation of the m-th Alssm in CompositeCost.alssms, respectively.

Returns

out (AlssmStackedSO) – Stacked ALSSM
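A usage sketch (hypothetical; reuses the CompositeCost cc from the ind_x() example above, output omitted):

>>> stacked = cc.get_stacked_alssm(alssm_selection=[1, 1])  # AlssmStackedSO joining both polynomial models
>>> print(stacked.dump_tree())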

class SEParam(cost_model, use_steady_state=False, params=None, segment_scalars=None, label=None)

Bases: object

Data container for squared error filter’s output

SEParam computes and stores intermediate values such as covariances, as required to efficiently solve least squares problems between a model-based cost function (a CostSegment or CompositeCost) and given observations. The intermediate variables are observation dependent, and therefore the memory consumption of SEParam scales linearly with the observation vector length.

Main intermediate variables are the covariance W, the weighted mean xi, the signal energy kappa, and the weighted number of samples nu, see Equation (4.6) in [Wildhaber2019] PDF

If use_steady_state = False, the variable \(W_k\) is computed recursively for each sample \(k\) (slower computation); if use_steady_state = True, a common \(W_k = W_{steady}\) is used for all samples (faster computation). Note that using a common \(W_k\) potentially leads to border mismatch effects and to completely invalid results when samples have individual sample weights.

Parameters
  • cost_model (CostSegment, CompositeCost) – Cost Model

  • use_steady_state (bool, optional, default = False) – true: Indicates the use of \(W_k = W_{steady}\) in steady state for all samples. Using steady state instead of per-sample computation of \(W_k\) strongly reduces the computational load of filter().

  • params (tuple with elements of {'xi', 'kappa', 'nu'}, None, optional) – Selection of the parameters to be computed, e.g., params = ('xi',) computes \(\xi\) only, which is often sufficient to derive the initial state leading to the minimum squared error (if \(W\) is constant). The default value is ('xi', 'kappa', 'nu').

  • segment_scalars (array_like of shape=(P,) of floats, None, optional) – Factor weighting each of the P cost segments. If segment_scalars is not set, each cost segment is weighted by 1.

  • label (str, optional) – SEParam label (default: label = None)

P : number of segments

Examples

Filtering a single-channel signal using a CostSegment with a polynomial ALSSM and one Segment

import matplotlib.pyplot as plt
from lmlib.utils.generator import *
import lmlib as lm

K = 500
y = gen_sinusoidal(K, k_period=100).reshape(-1, 1) + gen_wgn(K, sigma=0.1).reshape(-1, 1)

alssm = lm.AlssmPoly(poly_degree=2)
segment = lm.Segment(-50, 0, lm.FORWARD, g=15)
cost = lm.CostSegment(alssm, segment)

param = lm.SEParam(cost)
xs = param.filter_minimize_x(y)
y_hat = alssm.eval(xs)
J = param.eval_errors(xs, range(K))

fig, axs = plt.subplots(2, 1, figsize=(6,3))
axs[0].set(xlabel='k', ylabel=r'$y$, $\hat{y}$')
axs[0].plot(range(K), y, label=r'$y$')
axs[0].plot(range(K), y_hat, label=r'$\hat{y}$')
axs[0].legend()
axs[1].set(xlabel='k', ylabel='squared error')
axs[1].plot(range(K), J, label=r'$J$')
axs[1].legend()

plt.tight_layout()
plt.show()


Filtering a single-channel signal using a CompositeCost with a polynomial ALSSM and two Segments

import matplotlib.pyplot as plt
from lmlib.utils.generator import *
import lmlib as lm

K = 500
y = 2*gen_triangle(K, k_period=100).reshape(-1, 1) + gen_wgn(K, sigma=0.1).reshape(-1, 1)

alssm = lm.AlssmPoly(poly_degree=2)
segment_left = lm.Segment(-51, -1, lm.FORWARD, g=15)
segment_right = lm.Segment(0, 50, lm.BACKWARD, g=15)

cost = lm.CompositeCost((alssm,), (segment_left, segment_right), F=[[1, 1]])

param = lm.SEParam(cost)
xs = param.filter_minimize_x(y)
y_hat = alssm.eval(xs)
J = param.eval_errors(xs, range(K))

fig, axs = plt.subplots(2, 1, figsize=(6,3))
axs[0].set(xlabel='k', ylabel=r'$y$, $\hat{y}$')
axs[0].plot(range(K), y, label=r'$y$')
axs[0].plot(range(K), y_hat, label=r'$\hat{y}$')
axs[0].legend()
axs[1].set(xlabel='k', ylabel='squared error')
axs[1].plot(range(K), J, label=r'$J$')
axs[1].legend()

plt.tight_layout()
plt.show()
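
A further minimal sketch (example values chosen for illustration): enabling the steady-state computation of \(W_k\) and restricting the computed parameters to \(\xi\), as discussed in the class description above.

import lmlib as lm
from lmlib.utils.generator import *

K = 500
y = gen_sinusoidal(K, k_period=100) + gen_wgn(K, sigma=0.1)

alssm = lm.AlssmPoly(poly_degree=2)
segment = lm.Segment(-50, 0, lm.FORWARD, g=15)
cost = lm.CostSegment(alssm, segment)

# steady-state W_k = W_steady and xi-only computation (faster filter(); see caveat above)
param = lm.SEParam(cost, use_steady_state=True, params=('xi',))
xs = param.filter_minimize_x(y)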


Attributes

SEParam.W

Filter Parameter \(W\)

SEParam.xi

Filter Parameter \(\xi\)

SEParam.kappa

Filter Parameter \(\kappa\)

SEParam.nu

Filter Parameter \(\nu\)

SEParam.label

Label of SEParam

Methods

filter(y[, v])

Computes the intermediate parameters for subsequent squared error computations and minimizations.

minimize_x([constraint_matrix, offset_vector])

Returns the state vector x of the squared error minimization with linear constraints

minimize_v(constraint_matrix[, offset_vector])

Returns the vector v of the squared error minimization with linear constraints

filter_minimize_x(y[, v, constraint_matrix, …])

Combination of SEParam.filter() and SEParam.minimize_x().

eval_errors(xs[, ks])

Evaluation of the squared error for multiple state vectors xs.

filter(y, v=None)

Computes the intermediate parameters for subsequent squared error computations and minimizations.

Computes the intermediate parameters using efficient forward and backward recursions. The results are stored internally in this SEParam object, ready to solve the least squares problem using, e.g., minimize_x() or minimize_v(). The parameter allocation allocate() is called internally, so a manual pre-allocation is not necessary.

Parameters
  • y (array_like, shape=(K, [L [, S]])) – Input signal
    Single-channel signal is of shape =(K,)
    Multi-channel signal is of shape =(K,L)
    Multi-channel-sets signal is of shape =(K,L,S)

  • v (array_like, shape=(K,), optional) – Sample weights. Weights the cost contribution of each time step k; the same weight applies to all channels. By default, the sample weights are initialized to 1.

K : number of samples
L : output order / number of signal channels
S : number of signal sets
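
A hedged sketch of per-sample weighting: samples with weight 0 do not contribute to the squared error (the signal and the masked index range are assumptions for illustration).

import numpy as np
import lmlib as lm
from lmlib.utils.generator import *

K = 400
y = gen_sinusoidal(K, k_period=80) + gen_wgn(K, sigma=0.1)

v = np.ones(K)
v[150:160] = 0.0  # zero weight: exclude a suspected outlier burst from the fit

cost = lm.CostSegment(lm.AlssmPoly(poly_degree=2), lm.Segment(-30, 0, lm.FORWARD, g=20))
param = lm.SEParam(cost)
param.filter(y, v=v)
xs = param.minimize_x()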

minimize_x(constraint_matrix=None, offset_vector=None)

Returns the state vector x of the squared error minimization with linear constraints

Minimizes the squared error over the state vector x. If needed, it is possible to apply linear constraints with an (optional) offset, see [Wildhaber2018] [TABLE V].

Constraint:

  • Linear Scalar : \(x=Hv,\,v\in\mathbb{R}\)

    known : \(H \in \mathbb{R}^{N \times 1}\)

  • Linear Combination With Offset : \(x=Hv+h,\,v\in\mathbb{R}^M\)

    known : \(H \in \mathbb{R}^{N \times M},\,h\in\mathbb{R}^N\)

See also minimize_v()

Parameters
  • constraint_matrix (array_like, shape=(N, M), optional) – Matrix for linear constraining \(H\)

  • offset_vector (array_like, shape=(N, [S]), optional) – Offset vector for linear constraining \(h\)

Returns

xs (ndarray of shape = (K, N [,S])) – Least-squares state vector estimate for each time index. The shape of one state vector x[k] is (N, [S]), where k is the time index of K samples, N the ALSSM order, and S the number of multi-channel sets.

K : number of samples
N : ALSSM system order, corresponding to the number of state variables
S : number of signal sets
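
A minimal sketch of the constrained case (the signal and the particular H and h are assumptions for illustration): the slope of a straight-line model is fixed through the offset vector h, leaving only the line offset as the free scalar v in x = Hv + h.

import numpy as np
import lmlib as lm
from lmlib.utils.generator import *

K = 300
slope = 0.02  # assumed, known per-sample slope
y = slope * np.arange(K) + gen_wgn(K, sigma=0.2)

cost = lm.CostSegment(lm.AlssmPoly(poly_degree=1), lm.Segment(-40, 0, lm.FORWARD, g=30))
param = lm.SEParam(cost)
param.filter(y)

# constraint x = H v + h: the offset (first state) stays free, the slope is fixed to 'slope'
H = np.array([[1.0], [0.0]])
h = np.array([0.0, slope])
xs = param.minimize_x(constraint_matrix=H, offset_vector=h)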

minimize_v(constraint_matrix, offset_vector=None)

Returns the vector v of the squared error minimization with linear constraints

Minimizes the squared error over the vector v with linear constraints with an (optional) offset [Wildhaber2018] [TABLE V].

Constraint:

  • Linear Scalar : \(x=Hv,\,v\in\mathbb{R}\)

    known : \(H \in \mathbb{R}^{N \times 1}\)

    \(\hat{v}_k = \frac{\xi_k^{\mathsf{T}}H}{H^{\mathsf{T}}W_k H}\)

  • Linear Combination With Offset : \(x=Hv+h,\,v\in\mathbb{R}^M\)

    known : \(H \in \mathbb{R}^{N \times M},\,h\in\mathbb{R}^N\)

    \(\hat{v}_k = \big(H^{\mathsf{T}}W_k H\big)^{-1} H^\mathsf{T}\big(\xi_k - W_k h\big)\)

Parameters
  • constraint_matrix (array_like, shape=(N, M)) – Matrix for linear constraining \(H\)

  • offset_vector (array_like, shape=(N, [S]), optional) – Offset vector for linear constraining \(h\)

Returns

v (ndarray, shape = (K, M [,S])) – Least-squares estimate of the constrained variables for each time index. The shape of one estimate v[k] is (M, [S]), where k is the time index of K samples, M the number of constrained variables (columns of H), and S the number of multi-channel sets.

K : number of samples
M : number of constrained variables
N : ALSSM system order, corresponding to the number of state variables
S : number of signal sets
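
A minimal sketch (reusing cost, param, and H from the minimize_x() sketch above, without the offset h): the constraint x = Hv with H selecting only the line offset makes v a per-index locally weighted mean estimate.

vs = param.minimize_v(H)  # per-index least-squares estimate of the scalar v; shape (K, 1)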

filter_minimize_x(y, v=None, constraint_matrix=None, offset_vector=None)

Combination of SEParam.filter() and SEParam.minimize_x().

This method has the same output as calling the methods

separam.filter(y)
xs = separam.minimize_x()

but is computationally more efficient.

eval_errors(xs, ks=None)

Evaluation of the squared error for multiple state vectors xs.

The return value is the squared error

\[J(x) = x^{\mathsf{T}} W_k x - 2\, x^{\mathsf{T}} \xi_k + \kappa_k\]

for each state vector \(x\) from the list xs.

Parameters
  • xs (array_like of shape=(XS,N[,S])) – List of state vectors \(x\)

  • ks (list of shape=(XS,) of int, range, None, optional) – List of error evaluation indices, selecting the corresponding squared error parameters \(W_k\), \(\xi_k\), \(\kappa_k\). If ks is not given, the index range of SEParam.xi is used internally (if SEParam.xi was not computed, SEParam.kappa is used instead).

Returns

J (np.ndarray of shape=(XS[,S])) – Squared error for each state vector

XS : number of state vectors in a list
N : ALSSM system order, corresponding to the number of state variables
S : number of signal sets

class ConstrainMatrix

Bases: numpy.ndarray

Builder class to set up the matrix H as a linear constraint for the squared error minimization

Examples

>>> import lmlib as lm
>>> import numpy as np
>>>
>>> a1 = lm.AlssmPoly(1, label='left_poly')
>>> a1.set_state_variable_label('offset', (0,))
>>>
>>> a2 = lm.AlssmPoly(1, label='right_poly')
>>> s1 = lm.Segment(a=-20, b=-1, direction=lm.FORWARD, g=2)
>>> s2 = lm.Segment(a=0, b=19, direction=lm.BACKWARD, g=3)
>>> F = np.eye(2)
>>> cc = lm.CompositeCost((a1, a2), (s1, s2), F)
>>> H = lm.ConstrainMatrix(cc).constrain_linear(('left_poly.offset', 'right_poly.x0'), (1, 1), labels=('offset',))
>>>
>>> print(H)
[[1. 0. 0.]
 [0. 1. 0.]
 [1. 0. 0.]
 [0. 0. 1.]]
>>> print(H.v_state_labels)
{'offset': (0,), 'v1': (1,), 'v2': (2,)}
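
The resulting matrix is intended to be passed as constraint_matrix to the squared error minimization; a hedged follow-up (assuming a filtered SEParam instance param built on the same CompositeCost cc):

vs = param.minimize_v(H)  # estimates of the constrained variables v; indices as in H.v_state_labels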

Attributes

Methods

constrain_linear(indices, multipliers[, labels])

Returns a new instance of the Constrain Matrix H with the passed linear constraints.

set_v_state_label(name, indices)

Sets a new label for the v state variable index/indices in the x-label dictionary

constrain_linear(indices, multipliers, labels=(None, ))

Returns a new instance of the Constrain Matrix H with the passed linear constraints.

Parameters
  • indices (tuple of int, tuple of array_like) – Contains the positions of the state vector variables to which a constraint should be applied

  • multipliers (tuple of int, tuple of array_like) – Contains the constraint multiplier values, in the same order as indices

  • labels (tuple of str) – Contains the labels of the v state variables, in the same order as the constraints

Returns

H (ConstrainMatrix) – Constraint matrix H

set_v_state_label(name, indices)

Sets a new label for the v state variable index/indices in the x-label dictionary

Parameters
  • name (str) – Label name

  • indices (tuple) – Tuple with index/indices

Examples