Data Filtering

All measurements are contaminated by some kind of error (systematic biases, gross errors, and noise) that interferes with process operation and with data analysis for model development and process supervision. When the instrument is not properly calibrated, the average value of a series of measurements does not correspond to the true value of the measured variable; in this case we speak of a systematic bias. Gross errors are caused, for example, by malfunctions of the measuring system, which produce unrealistic values of the measured variable. Noise can be classified (see Fig. 26.2) as: high frequency noise, associated with the intrinsic limitations of any instrument, which cannot produce exactly the same value over a series of independent measurements even if the measured variable is kept constant; medium frequency noise, due to process heterogeneity (turbulence and poor mixing); and low frequency noise, caused by process disturbances (environmental conditions, metabolic heat, bed drying, etc.). The latter can be reduced by automatic process control, but high- and medium-frequency noise should be reduced by signal processing, that is, by "data filtering".

In large-scale SSF bioreactors the solid bed is highly heterogeneous and its characteristics are time-varying (water content, biomass content, solid-gas interphase area, porosity, etc.), so it is difficult to infer its average conditions directly from the measurements. As a result, typical on-line readings, such as temperatures, gas flow rate or relative humidity, show significant noise and, during the agitation period, many outliers (gross errors), which appear due to the liberation of occluded gas and electrical interference from the motor drives. Therefore, data processing is of utmost importance for operating this kind of bioreactor well (Peña y Lillo 2000). This is especially important if advanced control techniques have been implemented, since these control algorithms do not work without reliable process models, which in turn are obtained from good-quality process data.
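As an illustration of how gross errors of this kind might be handled, the sketch below applies a moving-median spike filter to a measurement series. The window length, rejection threshold and example temperature values are assumptions chosen for illustration only, not values taken from the text.

# Sketch of a moving-median spike filter for rejecting gross errors
# (e.g., outliers produced during bed agitation). Window length and
# rejection threshold are illustrative assumptions, not recommended values.

def median(values):
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    return ordered[mid] if n % 2 else 0.5 * (ordered[mid - 1] + ordered[mid])

def despike(samples, window=5, threshold=2.0):
    """Replace any sample deviating from the local median by more than
    'threshold' (in the measurement's units) with that median."""
    filtered = []
    half = window // 2
    for i, x in enumerate(samples):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        local_median = median(samples[lo:hi])
        filtered.append(local_median if abs(x - local_median) > threshold else x)
    return filtered

# Example: a temperature trace with one spurious reading during agitation
temperatures = [35.1, 35.2, 35.3, 48.9, 35.4, 35.5]   # 48.9 is a gross error
print(despike(temperatures))   # the spike is replaced by the local median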

Here we will discuss simple filtering algorithms that can significantly enhance the reliability of noisy measurements.
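A common example of such a simple algorithm is the discrete first-order (exponential) filter, y_k = alpha*x_k + (1 - alpha)*y_{k-1}. The sketch below is illustrative only; the filter constant alpha and the example data are assumed values, not recommendations from the text.

# Sketch of a discrete first-order (exponential) filter:
#   y_k = alpha * x_k + (1 - alpha) * y_{k-1}
# alpha in (0, 1]: values near 1 give little smoothing, values near 0
# give heavy smoothing but more lag. The value used here is illustrative.

def exponential_filter(samples, alpha=0.2):
    filtered = []
    y = samples[0]               # initialize with the first measurement
    for x in samples:
        y = alpha * x + (1.0 - alpha) * y
        filtered.append(y)
    return filtered

noisy = [35.0, 35.6, 34.7, 35.3, 34.9, 35.8, 35.1]
print(exponential_filter(noisy))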

Although the signals produced by the instruments are continuous (analog), control calculations are usually performed by digital microprocessors that can only operate on digital (discrete) signals. Hence, the analog signal provided by the instrument must be converted to a discrete signal, that is, a signal whose values are reported at regular time intervals. The time interval between two consecutive values is known as the sampling time and must be specified by the process engineer. If the sampling time is too short, the computer control system will be overloaded; therefore, the sampling rate is bounded by the processing speed of the control device and by the number of control loops that it handles. On the other hand, if the sampling time is too long, the converted discrete signal will not reproduce the real process dynamics contained in the original analog signal. Hence, a compromise must be reached, and usually the Shannon sampling theorem (Astrom and Wittenmark 1984) is used to find an upper limit on the sampling time (equivalently, a lower limit on the sampling rate). Table 26.2 shows common sampling times used in practice.
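As a rough illustration of how such a limit might be estimated, the sketch below applies the Shannon criterion (sample at least twice as fast as the highest frequency of interest, with a larger safety factor in practice) to an assumed first-order process. The time constant and safety factor are illustrative assumptions, not values from Table 26.2.

# Sketch: upper bound on the sampling time from the Shannon criterion.
# The process time constant below is an assumed example value.

import math

process_time_constant = 120.0   # seconds (assumed, e.g., a slow temperature loop)
highest_frequency = 1.0 / (2.0 * math.pi * process_time_constant)   # approx. bandwidth, Hz

# Shannon: the sampling frequency must exceed twice the highest frequency of
# interest, i.e., the sampling time must be shorter than half the corresponding period.
max_sampling_time = 1.0 / (2.0 * highest_frequency)

# In practice a safety factor of 5 to 10 over the Shannon minimum rate is common.
practical_sampling_time = max_sampling_time / 5.0

print(f"Shannon upper bound on sampling time: {max_sampling_time:.1f} s")
print(f"Practical sampling time (factor 5): {practical_sampling_time:.1f} s")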

