Several paradigms are available for developing nonlinear dynamic input-output models of processes. These models can describe pathological dynamic behavior and provide accurate predictions over a wider range of operating conditions than linear models. ANNs were introduced in the previous section. Chapter 5 presents system science methods for nonlinear model development. Various other nonlinear model development paradigms, such as time series models, Volterra kernels, cascade (block-oriented) models, and nonlinear PLS, have been developed. Extensions of linear empirical model development techniques based on time series models and PLS are introduced in this section to expand the alternatives available for building nonlinear models. Polynomial models, threshold models, and models based on spline functions can describe various types of nonlinear behavior observed in many physical processes. Polynomial models include bilinear models, state-dependent models, nonlinear autoregressive moving average models with exogenous inputs (NARMAX), nonlinear polynomial models with exponential and trigonometric functions (NPETM), canonical variate nonlinear subspace models, and multivariate adaptive regression splines (MARS). A unified nonlinear model development framework is not available, and the search for an appropriate nonlinear structure is part of the model development effort. Using a nonlinear model development paradigm that is not compatible with the types of nonlinearities present in the data can significantly increase the model development effort and degrade model accuracy. Various nonlinear time series modeling paradigms from the system identification and statistics literature are summarized in Section 4.7.1. A special group of nonlinear models based on the extension of PLS is presented in Section 4.7.2.

4.7.1 Nonlinear Input-Output Models in Time Series Modeling Literature

More than twenty nonlinear time series model structures have been proposed during the last four decades. They can be classified based on features such as the types of variables used in the model and the way the model parameters appear in the equations. The characteristic features of various nonlinear time series (NLTS) models are discussed in this subsection.

The three basic groups of variables used in NLTS models are:

1. Previous values of the dependent variable that yield autoregressive (AR) terms,

2. Sequences of independent and identically distributed (iid) random vectors (white noise) that provide moving average (MA) terms,

3. Input variables with nonrandom features that are called external (exogenous) (X) variables.

Volterra series models [624] do not utilize previous values of the dependent variable, while nonlinear autoregressive moving average models with exogenous variables (NARMAX) (Eqs. 4.171-4.174) use all three types of variables. Model structures are either linear or nonlinear in the parameters. The parameter estimation task is much less computationally intensive if the model parameters appear in a linear structure, since this permits the use of well-developed parameter estimation techniques from linear modeling paradigms. NARMAX, bilinear (Eq. 4.170), and threshold models (Eq. 4.177) are linear in the parameters, while exponential models are nonlinear in the parameters.

Volterra models have been utilized by Wiener [644] for the study of nonlinear systems by constructing transformations of Volterra series in which the successive terms are orthogonal. Expressing y(t) as a function of current and past values of a zero-mean white noise process e(t),

$$ y(t) = H\big(e(t), e(t-1), e(t-2), \ldots\big) $$

H can be expanded as a Taylor series about the point a:

$$ y(t) = g_0 + \sum_{i=1}^{\infty} g_1(i)\, e(t-i) + \sum_{i=1}^{\infty}\sum_{j=1}^{\infty} g_2(i,j)\, e(t-i)\, e(t-j) + \sum_{i=1}^{\infty}\sum_{j=1}^{\infty}\sum_{k=1}^{\infty} g_3(i,j,k)\, e(t-i)\, e(t-j)\, e(t-k) + \cdots \qquad (4.167) $$

where

$$ g_0 = H(a), \qquad g_1(i) = \left. \frac{\partial H}{\partial e(t-i)} \right|_{a}, \qquad g_2(i,j) = \left. \frac{\partial^2 H}{\partial e(t-i)\, \partial e(t-j)} \right|_{a}, \; \ldots \qquad (4.168) $$

When input u(t) and output y(t) are both observable, the Volterra series can be represented in terms of the input by replacing e(t) by u(t). If the system is linear, only the first derivative term is present and the model is completely characterized by the transfer function gi of the system. For nonlinear processes, additional terms in Eq. (4.167) must be included, and the generalized transfer functions concept is used [479].
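A truncated Volterra predictor is straightforward to evaluate once its kernels are fixed. The sketch below, with an illustrative memory length M and hypothetical kernel values g0, g1, g2 (none taken from the text), computes a second-order truncated series driven by a white-noise sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical truncated second-order Volterra model with memory M = 3:
# y(t) = g0 + sum_i g1(i) e(t-i) + sum_i sum_j g2(i,j) e(t-i) e(t-j)
M = 3
g0 = 0.1
g1 = np.array([0.5, 0.25, 0.1])   # first-order (linear) kernel
g2 = 0.05 * np.ones((M, M))       # second-order kernel (illustrative values)

e = rng.standard_normal(100)      # zero-mean white-noise input

def volterra2(e, t):
    """Evaluate the truncated series at time t (requires t >= M)."""
    past = e[t - M:t][::-1]       # e(t-1), e(t-2), e(t-3)
    return g0 + g1 @ past + past @ g2 @ past

y = np.array([volterra2(e, t) for t in range(M, len(e))])
```

If g2 were zero, only the linear kernel g1 would remain, which corresponds to the statement above that a linear system is completely characterized by its first-order transfer function.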

Exponential models of order k have the basic form [211]

$$ y(t) = \sum_{j=1}^{k} \left[ \alpha_j + \beta_j \exp\!\big(-\delta\, y(t-1)^2\big) \right] y(t-j) + e(t) \qquad (4.169) $$

where e(t) is a sequence of iid random variables, and α_j, β_j, and δ are model parameters. Since δ is in the argument of the exponential term, the model estimation problem is computationally more challenging.
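Simulating Eq. (4.169) shows its defining feature: the effective AR coefficients depend on the amplitude of the previous output through the exponential term. The parameter values below are illustrative assumptions, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate the order-k exponential model of Eq. (4.169):
# y(t) = sum_j [alpha_j + beta_j * exp(-delta * y(t-1)^2)] * y(t-j) + e(t)
k = 2
alpha = np.array([0.4, -0.2])     # illustrative parameters
beta = np.array([0.3, 0.1])
delta = 1.0

n = 200
y = np.zeros(n)
e = 0.1 * rng.standard_normal(n)  # iid noise sequence
for t in range(k, n):
    # amplitude-dependent AR coefficients: near alpha+beta for small y(t-1),
    # shrinking toward alpha as |y(t-1)| grows
    coeff = alpha + beta * np.exp(-delta * y[t - 1] ** 2)
    y[t] = coeff @ y[t - k:t][::-1] + e[t]
```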

Bilinear models [394] cannot describe several types of nonlinearities such as limit cycles, but they have a simple form that can describe processes where products of two variables appear in equations derived from first principles. The general form of a bilinear model is

$$ y(t) + \sum_{j=1}^{p} a_j\, y(t-j) = \sum_{j=0}^{r} c_j\, e(t-j) + \sum_{i=1}^{m} \sum_{j=1}^{k} b_{ij}\, y(t-i)\, e(t-j) \qquad (4.170) $$

where c_0 = 1 and e(·) represents another variable or white noise. With suitable choices of parameters, bilinear models can approximate a "well behaved" Volterra series relationship over a finite time interval [83].
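A minimal special case of Eq. (4.170), with one lag of each type and illustrative coefficients (assumptions, not from the text), can be simulated directly; the b11 term is the product-of-variables feature that distinguishes the bilinear model from a linear ARMA model.

```python
import numpy as np

rng = np.random.default_rng(2)

# First-order bilinear special case of Eq. (4.170):
# y(t) + a1*y(t-1) = e(t) + c1*e(t-1) + b11*y(t-1)*e(t-1)
a1, c1, b11 = -0.5, 0.3, 0.2      # illustrative coefficients (c0 = 1)

n = 300
e = 0.1 * rng.standard_normal(n)  # white-noise driving sequence
y = np.zeros(n)
for t in range(1, n):
    y[t] = -a1 * y[t - 1] + e[t] + c1 * e[t - 1] + b11 * y[t - 1] * e[t - 1]
```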

Nonlinear Polynomial Models. An important class of nonlinear polynomial models has been proposed by Billings and his coworkers [93, 95, 336]. Depending on the presence of autoregressive (AR), moving average (MA) terms, and/or exogenous (X) variables, they are denoted by acronyms such as NAR, NARX, or NARMAX. NARMAX models consist of polynomials that include various linear and nonlinear terms combining the inputs, outputs, and past errors. Once the model structure, i.e., the monomials to be included in the model, has been selected, identification of the parameter values (coefficients of the monomials) can be formulated as a standard least squares problem. The number of candidate monomials to be included in a NARMAX model ranges from about a hundred to several thousands for moderately nonlinear systems, so determination of the model structure by stepwise regression type techniques becomes inefficient. An algorithm that efficiently combines structure selection and parameter estimation has been proposed [290] and extended to MIMO nonlinear stochastic systems [94].

The NARMAX model [336] of a discrete-time multivariable nonlinear stochastic system with r inputs and m outputs is

$$ y(t) = f\big(y(t-1), \ldots, y(t-n_y),\; u(t-1), \ldots, u(t-n_u),\; e(t-1), \ldots, e(t-n_e)\big) + e(t) \qquad (4.171) $$

where y(t), u(t), and e(t) are the system output, input, and noise, respectively; n_y, n_u, and n_e are the maximum lags in the output, input, and noise, respectively; {e(t)} is a zero-mean iid sequence; and f(·) is some vector-valued nonlinear function. NARMAX models can be illustrated by a NAR model

$$ y_q(t) = f_q\big(y_1(t-1), \ldots, y_1(t-n_y), \ldots, y_m(t-1), \ldots, y_m(t-n_y)\big) $$

Writing f_q(·) as a polynomial of degree l yields

$$ y_q(t) = \sum_{i_1=1}^{n} \cdots \sum_{i_l=1}^{n} \theta^{(q)}_{i_1 \cdots i_l}\, z_{i_1}(t) \cdots z_{i_l}(t) + e_q(t), \qquad q = 1, \ldots, m \qquad (4.174) $$

where n = m × n_y, z_1(t) = y_1(t−1), z_2(t) = y_1(t−2), …, and z_{m n_y}(t) = y_m(t−n_y). All terms composed of z_{i_1}(t), …, z_{i_l}(t) in Eq. (4.173) are thus provided. Hence, for each q, 1 ≤ q ≤ m, Eq. 4.174 describes a linear regression model.
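Because Eq. (4.174) is linear in the coefficients θ, they can be estimated by ordinary least squares once the monomial regressors are chosen. The sketch below illustrates this on a hypothetical noise-free NARX system (the "true" coefficients and the degree-2 candidate set are assumptions for the example, not from the text).

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical "true" NARX system:
# y(t) = 0.5*y(t-1) + 0.8*u(t-1) - 0.3*y(t-1)*u(t-1)
n = 500
u = rng.uniform(-1, 1, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * u[t - 1] - 0.3 * y[t - 1] * u[t - 1]

# Candidate monomials of degree <= 2 in y(t-1) and u(t-1); with the
# structure fixed, the coefficients follow from linear least squares.
Y1, U1 = y[:-1], u[:-1]
Phi = np.column_stack([Y1, U1, Y1**2, Y1 * U1, U1**2])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
# theta recovers approximately [0.5, 0.8, 0.0, -0.3, 0.0]: the superfluous
# monomials y(t-1)^2 and u(t-1)^2 receive near-zero coefficients.
```

In practice the candidate set is far larger and noisy, which is why the combined structure-selection and estimation algorithm cited above [290] is preferred over naive regression on all monomials.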
